View Full Version : My friend is a luddite, but not without cause
Robocommie
30th November 2009, 01:59
So, the other day my very best friend and I were discussing Socialism. He's a Leftwinger like myself, very much into Frantz Fanon and interested in the Frankfurt School, but really more into liberation theology than old fashioned Marxism. We agree very frequently on a majority of the things we discuss, but one of the things we keep disagreeing on is the importance of technology.
I was mentioning a post here where somebody was discussing automated industrialization, and the use of robotics to suit the purposes of a socialist society. He rejected that outright, said that robotic socialists were dumb. He dismisses a lot of what he sees as technophilia, saying that technology is seen by far too many as a miracle cure to underlying social problems, and that we can't fix society's ills with robots.
The thing is, he has something of a point. Industrialization historically did benefit the wealthy the most because, after all, THEY were the only ones in a position to invest capital in these machines and direct them to their own ends. That's basic Marxism. He points out, for example, that Eli Whitney's cotton gin actually expanded American chattel slavery: by mechanizing the removal of cotton seeds it made cotton cultivation vastly more profitable, and planters responded by putting far more enslaved labor into the fields.
Those are good points, but ultimately, as I told him, I feel that technology is all about making tools, and that tools are wholly amoral. They are designed to do what we make them to do. In capitalism, the high expense of mechanized production tends to keep the workers from being able to compete and blatantly promotes the exploitation of labor. However, under a socialist system where the workers control the means of production, those machines would be directed to suit the needs of the workers and of society - thus a better tool would be simply a better tool.
I thought I'd bring this up here and get everyone's thoughts on mechanization. My friend's anti-technological stance is at times very frustrating, as I think it's a knee-jerk response to those bourgeois elements of society who think technology under capitalism improves everyone's lives equally, people we've both argued against frequently. Our argument got a little heated, which I regret, but I enjoy seeing the diversity of RevLeft's stances on things within the socialist umbrella.
Invincible Summer
30th November 2009, 02:21
Those are good points, but ultimately, as I told him, I feel that technology is all about making tools, and that tools are wholly amoral. They are designed to do what we make them to do. In capitalism, the high expense of mechanized production tends to keep the workers from being able to compete and blatantly promotes the exploitation of labor. However, under a socialist system where the workers control the means of production, those machines would be directed to suit the needs of the workers and of society - thus a better tool would be simply a better tool.
This.
Lots of people in the "left" tend to see technology as inherently evil.
Socialism/communism isn't about the glorification of manual labour. It shouldn't be about making the workers of the world do backbreaking work for the rest of their lives... to maintain a "gloriously proletarian" existence or something ridiculous. But people still have this idea about what communism is.
IMO, communism is about enriching the lives and raising the standard of living for everyone. Technological progress (worker-controlled, of course) is the most efficient way (if not the only way) for this to happen. How can the working classes enjoy the fruits of their labour if they work constantly?
CELMX
30th November 2009, 04:24
Yes, the automation of labor would be wonderful! I mean, no one wants to work, right? It would make a communist society MUCH easier to achieve and a revolution much easier if there were "robots." Then, people won't have to work anymore, there will be enough to go around without working people to the bone, and happiness:rolleyes:, yay!
anyhoo, as you can tell, i'm pro-technology. And to sum everything up, robots = good for the revolution! (and the soul)
Tatarin
30th November 2009, 06:11
He dismisses a lot of what he sees as technophilia, saying that technology is seen by far too many as a miracle cure to underlying social problems, and that we can't fix society's ills with robots.
Yes, if he sees the world as it is today, because newer and faster technologies won't by themselves make things easier for the working class. However, many social problems would vanish in a communist society, even without much use of technology; the few problems left would be those of difficult work and medicine.
He points out, for example, that Eli Whitney's cotton gin actually expanded American chattel slavery: by mechanizing the removal of cotton seeds it made cotton cultivation vastly more profitable, and planters responded by putting far more enslaved labor into the fields.
But what does this have to do with technology itself? It is the type of society, which shapes how progress works and directs its benefits to the few, that is ultimately the cause. In itself, globalization is a good thing: more open borders, more connectivity to people much farther away than ever before, and so on. But as we all know, it currently serves mainly to benefit global capital. In other words, it isn't internationalism that is the core problem, but the ends to which internationalism is put.
Another important point is that technology develops to make our lives easier. Even non-technological creatures use skills in an interesting parallel to our use of technology. For example, a spider's web would be pretty useless without trees to attach it to. Yes, that is evolution, but it still illustrates the point: life for the spider is considerably easier than if it had to develop muscles and hunting skills and chase down its food.
So, of course, humans have developed all kinds of tools and skills to make life easy, at least in ancient times. As society developed there came clan leaders and kings, and the rest is history. I mean, we did it once, so why wouldn't we develop it all again if something really drastic happened to the world?
This is another interesting thing in discussions with anti-technologists: development cannot stop. Somewhere, someone will create the first computer, the first robot, the first spaceship, and so on. Luddites hold that mechanized society is bad, but what do they propose to do about it? Every single person on earth would have to agree for it to work, and who would decide which technologies may be used and which may not? Who would stop them? You could take a look at my blog post on primitivism for a full argument against stopping, or trying to stop, such development. :)
New Tet
30th November 2009, 07:16
So, the other day my very best friend and I were discussing Socialism. He's a Leftwinger like myself, very much into Frantz Fanon and interested in the Frankfurt School, but really more into liberation theology than old fashioned Marxism. We agree very frequently on a majority of the things we discuss, but one of the things we keep disagreeing on is the importance of technology.
I was mentioning a post here where somebody was discussing automated industrialization, and the use of robotics to suit the purposes of a socialist society. He rejected that outright, said that robotic socialists were dumb. He dismisses a lot of what he sees as technophilia, saying that technology is seen by far too many as a miracle cure to underlying social problems, and that we can't fix society's ills with robots.
The thing is, he has something of a point. Industrialization historically did benefit the wealthy the most because, after all, THEY were the only ones in a position to invest capital in these machines and direct them to their own ends. That's basic Marxism. He points out, for example, that Eli Whitney's cotton gin actually expanded American chattel slavery: by mechanizing the removal of cotton seeds it made cotton cultivation vastly more profitable, and planters responded by putting far more enslaved labor into the fields.
Those are good points, but ultimately, as I told him, I feel that technology is all about making tools, and that tools are wholly amoral. They are designed to do what we make them to do. In capitalism, the high expense of mechanized production tends to keep the workers from being able to compete and blatantly promotes the exploitation of labor. However, under a socialist system where the workers control the means of production, those machines would be directed to suit the needs of the workers and of society - thus a better tool would be simply a better tool.
I thought I'd bring this up here and get everyone's thoughts on mechanization. My friend's anti-technological stance is at times very frustrating, as I think it's a knee-jerk response to those bourgeois elements of society who think technology under capitalism improves everyone's lives equally, people we've both argued against frequently. Our argument got a little heated, which I regret, but I enjoy seeing the diversity of RevLeft's stances on things within the socialist umbrella.
The term, I think, is automation.
Modern technology has outgrown its capitalist cocoon. The social and economic forms under and over which it was shaped are too small and constricting.
Capitalism can no longer deal in a progressive manner with the social consequences of its own technical advances. So it markets the illusion of its unrealized potential to serve as an "opium of the masses". Politicians and pundits frequently extol "Technology" as a panacea for all problems political and economic. Hell, technology "creates" jobs! Never mind the robots that currently replace whole groups of workers in the machinery assembly industries, okay?
Meanwhile, to achieve capitalist global objectives, technology is used by its masters as a weapon of expansion, conquest and, if need be, destruction in foreign lands where the factor of labor is cheap, abundant and accustomed to deplorable conditions and treatment.
Instantly I am reminded of the two most extreme consequences of the worship of technology by the ruling classes of our times: The holocaust of European Jewry and the destruction and massacre of Hiroshima and Nagasaki. Technology in the hands of warring capitalist factions is a terrible thing to behold.
http://www.youtube.com/watch?v=bu1CwE59Pho&feature=related
ZeroNowhere
30th November 2009, 07:46
I'm fairly sure the Luddites were a movement in the early 19th century, made up mainly of industrial workers and textile artisans, who sabotaged stocking frames and similar machinery because it cost them their jobs and lowered their standard of living; they saw in the stocking frame an embodiment of capitalist social relations. If your friend is a Luddite, ask them for their time machines.
ÑóẊîöʼn
30th November 2009, 19:52
Instantly I am reminded of the two most extreme consequences of the worship of technology by the ruling classes of our times: The holocaust of European Jewry and the destruction and massacre of Hiroshima and Nagasaki. Technology in the hands of warring capitalist factions is a terrible thing to behold.
Err, what? "Worship of technology" (what the hell does that even mean anyway?) had nothing to do with the anti-Semitic slaughter during the Holocaust - it was a consequence of a "tradition" in Europe of scapegoating Jews that goes back centuries. Industrial technology made it easier to gather and kill large numbers of people, but it did not provide the motivation, which had been festering and sporadically erupting for a long time before anyone even conceived of an industrial production line.
As for Hiroshima and Nagasaki, my understanding is that they were a show of strength directed implicitly at the USSR as well as a means of cowing the Japanese. In other words, the decision to nuke two cities was the consequence of politics, and again technology merely provided the tools.
There's nothing inherent to technology which makes it evil - railways can be used to transport food to where it's needed, or to take political dissidents and despised minorities to their deaths. Nuclear bombs can be used to slaughter millions or to seed new life out in the stars.
Blaming the use of technology for atrocities is not only intellectually vacuous, it shifts the culpability from people to machines. Since we've yet to build a machine that thinks for itself and put it on trial, blaming technology makes no sense morally speaking.
bcbm
30th November 2009, 21:40
as i understand the op, it seems like the critique being raised isn't against technology, but against the idea that technology will solve all of the problems of capitalism or be essential to building communism.
ÑóẊîöʼn
30th November 2009, 21:55
as i understand the op, it seems like the critique being raised isn't against technology, but against the idea that technology will solve all of the problems of capitalism or be essential to building communism.
Technology alone cannot solve the problems of capitalism, because they extend to areas outside of technology.
But technology will be essential for a modern communist society. The fantasy that we can go back to a hunter-gatherer lifestyle is just that - a fantasy. Moreover, were such a dream to become reality, it would of necessity have to come about through nightmarish agencies.
bcbm
30th November 2009, 23:51
Technology alone cannot solve the problems of capitalism, because they extend to areas outside of technology.
But technology will be essential for a modern communist society. The fantasy that we can go back to a hunter-gatherer lifestyle is just that - a fantasy. Moreover, were such a dream to become reality, it would of necessity have to come about through nightmarish agencies.
what does anything i just said have to do with wanting to return to a hunter-gatherer lifestyle?
ÑóẊîöʼn
1st December 2009, 20:46
what does anything i just said have to do with wanting to return to a hunter-gatherer lifestyle?
Because that's what "communism" without technology looks like? Anything in between that and high-tech communism will have the problem of scarcity, which makes communist societies non-viable.
Technocrat
1st December 2009, 21:00
Those are good points, but ultimately, as I told him, I feel that technology is all about making tools, and that tools are wholly amoral. They are designed to do what we make them to do. In capitalism, the high expense of mechanized production tends to keep the workers from being able to compete and blatantly promotes the exploitation of labor. However, under a socialist system where the workers control the means of production, those machines would be directed to suit the needs of the workers and of society - thus a better tool would be simply a better tool.
Yep, you nailed it. If your friend doesn't accept this argument, I would suspect his position is not entirely based in rational thinking.
The Luddites first appeared as technology began taking people's jobs and thus their livelihoods. But why did they suffer as a result of this? Because of the social order present at the time - capitalism/the market system/the price system. Because those who *owned* the technology reaped all the benefits.
New Tet
1st December 2009, 21:41
Err, what? "Worship of technology" (what the hell does that even mean anyway?) had nothing to do with the anti-Semitic slaughter during the Holocaust - it was a consequence of a "tradition" in Europe of scapegoating Jews that goes back centuries. Industrial technology made it easier to gather and kill large numbers of people, but it did not provide the motivation, which had been festering and sporadically erupting for a long time before anyone even conceived of an industrial production line.
Are you arguing that capitalism and its ruling class are not directly responsible for the horrors it inflicts on present-day humanity? Or that the capitalist fetish of technology and the myth of its efficacy is not often invoked in the course of inflicting those horrors?
If so, I disagree.
As for Hiroshima and Nagasaki, my understanding is that they were a show of strength directed implicitly at the USSR as well as a means of cowing the Japanese. In other words, the decision to nuke two cities was the consequence of politics, and again technology merely provided the tools.
True but superficial, I think.
Again, I cite the A-Bomb as evidence that the ruling classes of our days believe--and want us to believe--that there is a technological solution to every problem, including war. The Nazis were big on technology and enlisted the help of IBM, General Motors, Krupp, Farben and so on to systematize and expedite in industrial form the slaughter of millions of people.
At the service of what, besides exploitation, is technology under capitalism if not the development of armaments and weapons of war?
There's nothing inherent to technology which makes it evil - railways can be used to transport food to where it's needed, or to take political dissidents and despised minorities to their deaths. Nuclear bombs can be used to slaughter millions or to seed new life out in the stars.
"Seed[ing] new life out in the stars" sounds pretty but at this point still belongs in the pages of Asimov and Clarke.
Note that I did not write anywhere that it was technology per se that was to blame, as you mistakenly insinuate above.
Blaming the use of technology for atrocities is not only intellectually vacuous, it shifts the culpability from people to machines. Since we've yet to build a machine that thinks for itself and put it on trial, blaming technology makes no sense morally speaking.
I agree. Instead we should blame capitalism for its crimes and understand and avoid the mythos of its technology.
bcbm
1st December 2009, 22:26
Because that's what "communism" without technology looks like? Anything in between that and high-tech communism will have the problem of scarcity, which makes communist societies non-viable.
where did i say anything about having communism without technology? i said that technology will not be essential in building it; that is, it will be built by people, not by robots.
Vanguard1917
2nd December 2009, 00:26
Environmentalists: don't label them Luddites (http://www.spiked-online.com/index.php/site/article/4299/)
Using the L-word to describe today’s middle-class eco-miserabilists is an insult to the nineteenth-century radicals who fought for their rights and dignity.
Lord Testicles
6th December 2009, 17:42
"Seed[ing] new life out in the stars" sounds pretty but at this point still belongs in the pages of Asimov and Clarke.
What are you talking about? We have had nuclear propulsion since the 1950s.
Technocrat
6th December 2009, 23:15
Man's use of technology is what distinguishes him from the apes. Primitivists don't get it - humans use technology because it is human nature to do so. If we didn't use technology, we would just be another kind of chimpanzee.
pastradamus
7th December 2009, 14:34
Do us technophobes count as Luddites? :lol: I'm of the belief that the simpler something is, the better it is - fewer moving parts, user-friendliness, etc., the basic Kalashnikov doctrine. When someone talks about random-access memory, CPUs, and all the other associated words, my mind goes to sleep.
Luisrah
7th December 2009, 17:57
Do us technophobes count as Luddites? :lol: I'm of the belief that the simpler something is, the better it is - fewer moving parts, user-friendliness, etc., the basic Kalashnikov doctrine. When someone talks about random-access memory, CPUs, and all the other associated words, my mind goes to sleep.
I know what you mean.
Though those things are complicated (but it's a good thing there are a lot of people who like working with them), what bugs me most is technology combined with capitalism.
You have 50 brands of cell phones and 50 different chargers, for example.
If you have a Samsung cell phone and your Samsung charger breaks, you can't just use a Nokia charger; you have to buy another Samsung one.
omg that bugs me so much, because it happens with everything else too. :mad:
pastradamus
7th December 2009, 18:23
I know what you mean.
Though those things are complicated (but it's a good thing there are a lot of people who like working with them), what bugs me most is technology combined with capitalism.
You have 50 brands of cell phones and 50 different chargers, for example.
If you have a Samsung cell phone and your Samsung charger breaks, you can't just use a Nokia charger; you have to buy another Samsung one.
omg that bugs me so much, because it happens with everything else too. :mad:
Yes sir. Mingle Capitalism and Technology together and you've a right pain in the arse. :thumbdown:
Invincible Summer
8th December 2009, 06:12
We're talking about technology as a neutral tool here, but what about science (which is closely intertwined with it)? Is science value-free, amoral, apolitical?
I would say that it is not; rather, it is affected by and reflects the foci of the societal system in which it is created and used.
Sov
17th December 2009, 10:08
How many anti-technologists are ready to actually give up technology, e.g. air conditioning and central heating?
Maybe they can lead by example?
I know of quite a few anti-technologists who rarely leave urban areas with high-tech infrastructure. They tend to despise "hicks" and huntin' and fishin' and all that. Yet who is less reliant on technology, dirty hillbillies who live out in the boondocks and can hunt with a knife or even their bare hands, or urban "Luddites" who live in apartments/dormitories/houses with central air and eat only industrially processed food?
If they were really to follow through with their programme, primitivists would make those dirty deer-huntin' "hicks" look urbane and high-tech by comparison. But instead, they tend to live right in the heart of ultramodern urban society and avoid socializing with those who don't.
This post applies to the "crazy" primitivists who call for actually returning to a pre-technological existence (for John Zerzan, even the simplest art is too technological and a tribal shaman is overly cultured!). This doesn't mean I'm a brainwashed geek who subscribes to scientism (as opposed to open-minded science) or worships technology. There is much to criticize (and oppose!) in techno-bureaucratic centralization, state/capitalist (mis)application of technology and lousy prioritization of technological development. The key though is in who makes and uses the tools, and how the tools are used, not the existence of tools themselves.
I've talked to a primitivist feminist who, like Millett (referencing Engels) traces patriarchy back to the discovery of paternity. Unlike Millett or Engels, she has the attitude that everything after this discovery has been tainted by the patriarchy and needs to be scrapped so people can start over again. However paternity cannot be "undiscovered." The problem was not the discovery, but rather the misinterpretation and misapplication of that knowledge. The solution is not in abolishing technology, but in abolishing patriarchy such as by de-emphasizing paternity, abolishing paternal privileges, making all relationships voluntary, and winning the sexual revolution. Then the patriarchal nuclear family will no longer be sustainable, and certain technologies could actually be harnessed to further develop and promote communal alternatives. We've already seen that technology (e.g. the pill) can work against the patriarchy, not just for it. Likewise capitalism, the state, bureaucratic centralization, etc.
Robocommie
17th December 2009, 10:48
as i understand the op, it seems like the critique being raised isn't against technology, but against the idea that technology will solve all of the problems of capitalism or be essential to building communism.
That's basically it. My friend has built up a lot of irritation towards the entire concept over quite some time, because of the number of liberals and centrists he's talked to who cluelessly extol the virtues of scientific discovery as a solution to society's ills, completely ignoring that in the world we live in, the benefits of modern technology are heavily slanted towards the first world. It's something I've felt a lot as well, because it's extremely frustrating to hear people talk about the importance of building a colony on the moon when many places around the world lack a decent sewage treatment system.
ckaihatsu
17th December 2009, 13:47
and winning the sexual revolution.
And would that be winning the sexual revolution, *blow* by *blow* -- ???
x D
If your friend is a Luddite, ask them for their time machines.
Uh, *no*, because, being Luddites, they *destroyed* them already -- *duh*!
x D
ckaihatsu
17th December 2009, 14:27
When someone talks about random-access memory, CPUs, and all the other associated words, my mind goes to sleep.
Well, speaking on behalf of random-access memory, CPUs, and associated hardware, they're pretty pissed that you find them uninteresting.
= D
I'm of the belief that the simpler something is, the better it is
I wish I could agree with you in a comradely way, but on this *technical* point, I can't. There are some functions, enabled *only* by complex digital technology, that we humans happen to find incredible and satisfying. How about portable, crystal-clear music without rewinding? Perhaps photos that you can see *immediately*, without waiting for chemical-based developing?
These processes require a certain *complexity* and *high-capacity* of digital components, not the least of which is microchip design.
But all that aside, I think the reason many people get understandably frustrated with technological consumer items is the lack of standards mentioned in this thread -- there was also an earlier period in which basic consumer conveniences were *gradually* developed and then *slowly* drip-fed onto the market, bleeding consumers' wallets over every little incremental improvement. (Remember TVs *without* remote controls?)
*Now*, here, a decade into the new millennium, what we're experiencing is something almost akin to sensory overload, because we've passed the point of being slowed down due to mechanical fine-tuning. Now *everything* is quite user-friendly and enabling, without hitting any technical, distracting speed bumps. An expansive *complexity* of consumer / life *choices* has opened up to us, due to the maturing and mainstreaming of the Internet, and of computer technology in general.
With rampant multitasking our computer-mediated experience can so easily become complexified that the return to a *simpler* user -- even human -- mode of life will seem about as exciting as watching grass grow. In dealing with a wider array of implements we have to become better at *lateral* and *complex* thinking (juggling), since there are no longer any sustained pauses from one task to the next, which we would conventionally experience as "simpler" or "linear".
The disclaimer here, of course, is that this is strictly a *technical* discussion, outside of any political or labor context.
Chris
ckaihatsu
17th December 2009, 14:45
IMO, communism is about enriching the lives and raising the standard of living for everyone.
While I'd be the *last* person to argue against a materialist -- and even materialistic -- stance, the funny thing here is that you're balancing our *entire ideology* on the single point of "raising the standard of living for everyone".
Certainly I *agree* with this principle, but the funny part is that our technical / political orientation is also our *limitation*. We, by this principle, *can't prescribe* *what* that "standard of living for everyone" is. Yes, we can fervently agitate for *humane* and *relatively better* standards, in general, but communism *isn't* a *specific*-oriented thing.
With communism we can, at best, reshape society so that every last person *has access* to better drinking water, habitable housing, modern conveniences, and so on, but, given these options, *not every* person *may want* the same ease of living as others. On this humanistic / humanities point a political perspective is silent since politics is about provisioning *in general*, and *not* specifically -- *that* has to be left up to the individual in their / our individual experience of life and living.
ckaihatsu
17th December 2009, 15:05
Technology alone cannot solve the problems of capitalism, because they extend to areas outside of technology.
I wouldn't mind exploring this a bit, because I have to admit that I continue to harbor *one* single hair -- not even a strand -- of "libertarian" thought on this matter.
Given that our basic human needs of life and livelihood are finite -- food, water, housing, electricity, education, etc. -- and that the people of the world's working class far outnumber those of the ruling class, couldn't there be some point at which technology alone would make politics obsolete altogether?
In other words, if all individual self-sufficiency some day became as simple as growing a single plant out of the ground, then society itself could very well lose its political component altogether. Without a fundamental interdependency there could be no monopolies and no power bases. Organizations of cultish control could easily be walked away from or overcome and dissolved with a like, countervailing force. No group could have any grounds for materially extorting anyone else, since individual life's own upkeep would be as simple as growing a dandelion.
Hey -- with genetic engineering you just never know!
Robocommie
18th December 2009, 01:12
Hey -- with genetic engineering you just never know!
Yeah... but the idea of planting a seed and growing a house, and planting other seeds and growing your clothes and your food and so on and so forth, it's all so waaay the hell out there. Like, practically Space Baby out there. We might as well not even think about it since we're going to be dealing with scarcity for a good long while.
But there are technological ideas I've been toying with that I think a future society could benefit from immensely, advances in urban planning and the like. Things like increased use of geothermal energy, arcologies, and vertical farms. Actually, I think you might be the one who told me about vertical farms.
ckaihatsu
18th December 2009, 06:14
it's [...] Space Baby out there.
Babies! In. Spaaaaaaaaaaaaaaaaaaaaaaaaaaaacccceeeeeeeeeee!
Things like increased use of geothermal energy, arcologies, and vertical farms. Actually, I think you might be the one who told me about vertical farms.
Shit, *maybe* -- I know it's come up in previous discussions around here. And, yeah, they're far more likely as societal, *civilizational* steps towards the liberation of humanity from oppressive political dependence and toxifying sources of energy.
But -- and this is by no means a *principled* argument -- there may be more-*individual*-scale solutions not too far off that might do for basic human needs what the cell phone has done for personal communications -- something like that.... (Okay, starting to hold breath... *now*!)
= )
Robocommie
18th December 2009, 19:13
Babies! In. Spaaaaaaaaaaaaaaaaaaaaaaaaaaaacccceeeeeeeeeee!
Damn you Kaihatsu. ;)
ckaihatsu
18th December 2009, 19:25
planting a seed and growing a house, and planting other seeds and growing your clothes and your food
Sounds like a Marxist sci-fi movie script just *waiting* to be written...! (*Besides* the babies in space part of the premise, that is...!)
= D
Dean
23rd December 2009, 23:40
There's nothing wrong with technology; it's a resource that will be valuable to any communist society. At the same time, it must be understood that technology can be, and is being, used against the working class of the world; in other words, it is by no means intrinsically positive for society. In this day and age, starvation is increasingly present and hunger even in rich nations is a powerful force, despite the huge advancements we have made in agriculture and in the production and distribution of food.
Until we have a communist revolution -- which is necessarily a human-defined, not a technologically-defined, mode of social organization -- technology will be used for plenty of vile capitalist interests. I think your friend is right to think that communism is generated by people, not by technology or "lack of scarcity," but at the same time technology must be embraced as a valuable tool in many facets of production and distribution.
Quail
24th December 2009, 15:31
Technology is important, and we have pretty much come to depend on it in most of society. I would personally find life without any technology very difficult, and probably wouldn't survive too long.
I don't see a society at any time where technology does everything for us.
-If everything is done for you, will you have enough of a sense of purpose? Would you feel as though you had achieved something with your life?
-There are some things that computers just can't do as well as real people. For example teaching. I can read all the maths I like online, but generally I need someone to explain the ins and outs of it and respond on a one-to-one level to my questions. I'm sure that computers (or robots) could be programmed to do this, but a computer or robot just doesn't have the same understanding of people and how it feels to be learning as an actual person does. Jobs such as nursing, counselling, etc are similar.
-There will obviously still be people working, and probably doing pretty mundane jobs tbh, because you need people watching to make sure that the machinery is working, performing maintenance, fixing things.
Technology can make life easier, but a world where it can do absolutely everything seems, to me, neither desirable nor particularly plausible.
ckaihatsu
24th December 2009, 22:44
This is going to be admittedly an *academic* exercise -- I have no *severe* disagreements with anything you're saying here....
Technology is important, and we have pretty much come to depend on it in most of society. I would personally find life without any technology very difficult, and probably wouldn't survive too long.
Tool-use is *inseparable* from humanity, in general -- the only tricky part is establishing a proper, post-capitalist material societal basis for its selection so that we're not developing the kinds of tools that only aid in *competitive* acts, which, by definition, are *destructive* or wasteful of human efforts.
I don't see a society at any time where technology does everything for us.
Technology should simply do whatever it is we *don't* want to do -- want some art decoration for your wall that goes well with the carpet? Do a web search of images with the keywords of "beige" and "painting" and see what comes up. Or develop a computer algorithm that does painting-like images, then print one out and frame it. But if one feels like *expressing* themselves through the painterly medium then one would have to pick up a brush and paints and do something artistic themselves.
-If everything is done for you, will you have enough of a sense of purpose? Would you feel as though you had achieved something with your life?
Hey -- until the world is *perfect* there's always politics...! = D
-There are some things that computers just can't do as well as real people. For example teaching. I can read all the maths I like online, but generally I need someone to explain the ins and outs of it and respond on a one-to-one level to my questions. I'm sure that computers (or robots) could be programmed to do this, but a computer or robot just doesn't have the same understanding of people and how it feels to be learning as an actual person does. Jobs such as nursing, counselling, etc are similar.
At the same time there's a fine line between being waited on and just being lazy oneself -- it's a toughie....
-There will obviously still be people working, and probably doing pretty mundane jobs tbh, because you need people watching to make sure that the machinery is working, performing maintenance, fixing things.
Hopefully *not*. This is the area in which there should be the *least* amount of controversy over the adoption of machinery -- in eliminating *gruntwork*, even for the relatively higher-level function of fixing *existing* machinery -- why can't we just get another machine to do that?
Technology can make life easier, but a world where it can do absolutely everything seems, to me, neither desirable nor particularly plausible.
There's *nothing* exciting, challenging, or adventurous about doing something that could easily be done by a machine -- one would *really* have to have a deep personal reason for it....
Also you might want to define what you mean here by "everything" -- it's quite a generalization....
I made a graphic recently that puts humanistic-type activities on an extended continuum across from technological-type things -- it may be helpful here....
Humanities-Technology Chart 2.0
http://i47.tinypic.com/j9269k.jpg
Quail
25th December 2009, 22:57
Hopefully *not*. This is the area in which there should be the *least* amount of controversy over the adoption of machinery -- in eliminating *gruntwork*, even for the relatively higher-level function of fixing *existing* machinery -- why can't we just get another machine to do that?
What happens when that machine malfunctions? Surely there would always have to be some level of human input to check that nothing had gone wrong with the machines? Otherwise we're just trusting a bunch of machines to check up on each other, and if, for whatever reason, they aren't working quite right then things will go wrong. If, for generations, we have trusted machines to do the job, would we have people trained in the correct areas to sort things out? I don't think you can fully eliminate mundane work, even if you do limit it as much as possible.
There's *nothing* exciting, challenging, or adventurous about doing something that could easily be done by a machine -- one would *really* have to have a deep personal reason for it....
Also you might want to define what you mean here by "everything" -- it's quite a generalization....
I don't think I ever said that we should still be doing things that we don't want to do that can be done easily by machines. That's why we invent machines.
By everything I was imagining a hypothetical world where all your daily needs such as cooking and cleaning, and all of your daily tasks such as shopping, were done for you. Relying too much on technology to do these things for you could be quite disastrous if there was, for example, a power outage. It doesn't seem desirable to live like that because we could lose our independence and ability to cope with difficulties.
I would also personally not feel comfortable entrusting everything to a machine. Machines are not infallible, and if I didn't want to learn about them in detail, if they did screw up I wouldn't be able to do anything about it, since I assume we wouldn't make lessons in engineering and computing compulsory?
As I said above, I also don't think it is plausible to completely eliminate work with machines. We have to program them and develop them (although I appreciate that there are people who would really enjoy developing and designing them) and it seems unrealistic to create an entire hierarchy of machine-slaves that all make sure that their fellow machines are working. If some of them malfunction and there's nobody there to spot that and fix them, the entire system of machines serving us will not work properly.
You could ask, "Why can't we design a machine to do that?"
But then if we designed a machine to do that, then where would the "chain" (for want of a better word) of machines end?
ckaihatsu
26th December 2009, 12:56
What happens when that machine malfunctions? Surely there would always have to be some level of human input to check that nothing had gone wrong with the machines? Otherwise we're just trusting a bunch of machines to check up on each other, and if, for whatever reason, they aren't working quite right then things will go wrong. If, for generations, we have trusted machines to do the job, would we have people trained in the correct areas to sort things out? I don't think you can fully eliminate mundane work, even if you do limit it as much as possible.
Considering that machines were developed by *human* intelligence and are in existence for *human* reasons / purposes, then, sure -- *of course* they're circumscribed to the sphere of our intentions and motivations.
If you don't mind my saying so, it's for this reason that I don't think the term 'trust' here is the best term to use. In constructing the world around us we have no other types of sentient life to blame outside of humanity -- this means that, in using tools from screwdrivers to computers, there is no "trusting" of any-*thing*. Either we have constructed the pathways of mechanical cause-and-effect linkages exactly in the manner of our intentions, or we haven't. Either we've taken *all* atmospheric / environmental factors into account within the way that we want our machines to operate, or we haven't. And those two considerations cover *everything* that has to do with mechanical operations.
In terms of *reducing* the mundane routines of tending to machinery, we could always use "chunking" strategies, the same way that our minds "chunk" experiential and conceptual memories up into higher-level concepts for easier reference and access. The muscles in our bodies also learn and can execute higher-level, more-sophisticated routines on their own -- once we've trained them with repetition -- with the slightest mental command.
So, likewise, we may want to simply *arrange* our *mechanical* mechanisms -- as extensions of our *own* nervous / awareness systems -- so as to make them as self-diagnosing and self-maintaining as possible. Again, this would have nothing to do with "trust", which implies a *social bond* among *people* -- it has everything to do with good design and engineering.
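To make the "chunking" idea concrete, here's a *rough* sketch in Python -- the `Component` / `Supervisor` names and structure are purely illustrative, my own invention rather than any real control-system library: low-level parts run their own self-checks, and the next level up sees only the *failures*, never every part.

```python
class Component:
    """A low-level part that can run its own self-check."""
    def __init__(self, name, check):
        self.name = name
        self.check = check  # callable returning True when healthy

    def self_diagnose(self):
        return self.check()


class Supervisor:
    """A 'chunk': polls many components, surfaces only the failures."""
    def __init__(self, components):
        self.components = components

    def report(self):
        # Healthy parts stay invisible; only faults bubble upward.
        return [c.name for c in self.components if not c.self_diagnose()]


# Simulated machine: two healthy parts, one failed sensor.
parts = [
    Component("motor", lambda: True),
    Component("sensor", lambda: False),  # simulated fault
    Component("pump", lambda: True),
]
print(Supervisor(parts).report())  # ['sensor']
```

Supervisors can themselves be grouped under a higher-level supervisor, which is exactly the "chunking" move: each layer condenses many low-level statuses into one summary for the layer above.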
I don't think I ever said that we should still be doing things that we don't want to do that can be done easily by machines. That's why we invent machines.
By everything I was imagining a hypothetical world where all your daily needs such as cooking and cleaning, and all of your daily tasks such as shopping, were done for you. Relying too much on technology to do these things for you could be quite disastrous if there was, for example, a power outage. It doesn't seem desirable to live like that because we could lose our independence and ability to cope with difficulties.
The *escalator* is an *excellent* example of something mechanical that has a *fallback* to something *manual* -- instead of climbing up flight after flight of stairs ourselves we can have escalators ferry us up for floor after floor. But -- in the event that an escalator may *break down*, it simply reverts back to *staircase* functionality by default.
*You* don't have to have *any* machines do your cooking and cleaning for you, but *others* *might* want that kind of automatic service, and *should* have access to such functionality. The mechanics of providing such service shouldn't be cumbersome, or energy intensive, or complicated, or prone to malfunction -- just as with any other tool we may come to use regularly. And, of course, there should be a manual fallback so that we're not *prevented* from fulfilling these routines ourselves by any machinery that happens to break down.
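The escalator's revert-to-staircase property can be stated as a little design pattern -- attempt the automated routine, and on breakdown fall through to the manual procedure instead of leaving the user stranded. A *minimal* sketch, assuming nothing beyond plain Python (the function names are made up for illustration):

```python
def with_fallback(automated, manual):
    """Run the automated routine; if it breaks down, degrade to manual."""
    def run(*args):
        try:
            return automated(*args)
        except Exception:
            # The machine broke down: revert to 'staircase' mode.
            return manual(*args)
    return run


def escalator(floors):
    raise RuntimeError("motor failure")  # simulated breakdown


def stairs(floors):
    return f"climbed {floors} floors on foot"


ascend = with_fallback(escalator, stairs)
print(ascend(3))  # climbed 3 floors on foot
```

The point of the pattern is that the fallback is designed in from the start, rather than bolted on after the machinery fails.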
I would also personally not feel comfortable entrusting everything to a machine. Machines are not infallible, and if I didn't want to learn about them in detail, if they did screw up I wouldn't be able to do anything about it, since I assume we wouldn't make lessons in engineering and computing compulsory?
Well thanks to mass production techniques no one *has* to "look under the hood" if they don't want to -- consumer computer technology, including all of the pathways of the Internet, has become *very* useful, reliable, automated, and user-friendly. As long as *some* people have an interest in developing these technical means then the rest of us are free to benefit, especially with infinitely-replicable digital-based media and software. *Plenty* of previously extensive, complicated office-information-administration-type functions have now been made effortlessly available to the average computer user, thanks to the computing power and information networks now commonly available, not to mention more sophisticated audio-video applications and so on....
As I said above, I also don't think it is plausible to completely eliminate work with machines. We have to program them and develop them (although I appreciate that there are people who would really enjoy developing and designing them) and it seems unrealistic to create an entire hierarchy of machine-slaves that all make sure that their fellow machines are working. If some of them malfunction and there's nobody there to spot that and fix them, the entire system of machines serving us will not work properly.
You could ask, "Why can't we design a machine to do that?"
But then if we designed a machine to do that, then where would the "chain" (for want of a better word) of machines end?
As I indicated above, we could *arrange* the *configuration* of lower-level components so that it's *easier* to "keep an eye" on all of them. Consider computer memory addresses -- "RAM", or random-access memory. A typical computer these days can access (read from and write to) *billions* of individual memory addresses, each containing a *different*, unique bit of data.
While computer memory is mostly reliable, if there *is* a problem that pops up, the computer's operating software has the means to determine *which* memory addresses are being problematic. In this way *we're* not *manually* looking over *billions* of memory addresses *ourselves* -- it's being done *for us* by a differently functioning part of the computer's operating system, and can readily report problems *to* us, in which case we'd probably just *replace* the malfunctioning part altogether with an identical, mass-produced replacement part.
In this way we'd *always* be "at the top" of the mechanical hierarchy, merely going about our lives the way we'd like to, and interrupted by a mechanical diagnostic function *only* if a problem pops up, which we would then have to turn our attention to.
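To illustrate the memory example concretely: a test routine writes a known pattern to every address, reads it back, and reports *only* the addresses that disagree -- roughly what real memory testers do, though the Python below is a simplification for the sake of argument, not actual firmware:

```python
def scan_memory(memory, pattern=0xAA):
    """Write a test pattern to every cell, read it back,
    and return only the addresses that fail to hold it."""
    bad_addresses = []
    for addr in range(len(memory)):
        memory[addr] = pattern
        if memory[addr] != pattern:  # a faulty cell won't hold the value
            bad_addresses.append(addr)
    return bad_addresses


class FaultyRAM(list):
    """Simulated RAM in which address 2 silently loses every write."""
    def __setitem__(self, addr, value):
        if addr == 2:
            value = 0  # stuck cell
        super().__setitem__(addr, value)


ram = FaultyRAM([0] * 8)
print(scan_memory(ram))  # [2]
```

A human never inspects the healthy addresses at all -- only the short list of faults ever reaches us.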
Or -- with better-designed *arrangements* of mechanical linkages we might even be able to *not* be interrupted for most mundane, foreseeable problems that pop up. Companies and institutions, for example, commonly have *automated backup routines* for their data archives so that even if a mechanical failure occurs, the all-important data itself still remains intact elsewhere, and is automatically recovered without the need for user attention or intervention.
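An automated backup-and-verify routine like the ones described above can be sketched with nothing but Python's standard library -- copy the archive, then compare checksums so the copy is known-good without anyone watching. The file names here are throwaway examples, not anyone's real setup:

```python
import hashlib
import os
import shutil
import tempfile


def checksum(path):
    """SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def backup(src, dst):
    """Copy src to dst, then verify the copy byte-for-byte.
    Returns True only if the backup is known-good."""
    shutil.copyfile(src, dst)
    return checksum(src) == checksum(dst)


# Demo with a throwaway file in a temporary directory.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "data.txt")
dst = os.path.join(workdir, "data.bak")
with open(src, "w") as f:
    f.write("all-important data")
print(backup(src, dst))  # True
```

In a real installation this would run on a schedule and raise an alert only when the verification fails -- no user attention needed in the normal case.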
8bit
27th December 2009, 05:17
I think the image of Richard Stallman -- who led the development of the GNU software suite, the GNU General Public License (an open license for computer software which encourages the sharing of work and information), and the Free Software Foundation, and who is a lifelong advocate of free software -- as my avatar, alongside my custom title of 'techno-marxist', makes my position on technology pretty obvious...
Technology is always originally beneficial only to the ruling class, but just as the technologies developed by the aristocracy for the purposes of expanding empires and competing with other empires eventually allowed merchants to travel safely far from castle walls, leading to the bourgeois revolution, the technologies developed by the bourgeoisie to compete amongst themselves and suppress the proletariat will only lead to a larger, more intellectual, more educated, and more class-aware proletarian class.
In fact, I believe that the Internet will be a defining factor in the transition to socialism, as it will cause exactly that. The development of FLOSS will only increase the proletariat's access to information and thus, ultimately, its class awareness.
Quail
27th December 2009, 16:05
Considering that machines were developed by *human* intelligence and are in existence for *human* reasons / purposes, then, sure -- *of course* they're circumscribed to the sphere of our intentions and motivations.
If you don't mind my saying so, it's for this reason that I don't think the term 'trust' here is the best term to use. In constructing the world around us we have no other types of sentient life to blame outside of humanity -- this means that, in using tools from screwdrivers to computers, there is no "trusting" of any-*thing*. Either we have constructed the pathways of mechanical cause-and-effect linkages exactly in the manner of our intentions, or we haven't. Either we've taken *all* atmospheric / environmental factors into account within the way that we want our machines to operate, or we haven't. And those two considerations cover *everything* that has to do with mechanical operations.
In terms of *reducing* the mundane routines of tending to machinery, we could always use "chunking" strategies, the same way that our minds "chunk" experiential and conceptual memories up into higher-level concepts for easier reference and access. The muscles in our bodies also learn and can execute higher-level, more-sophisticated routines on their own -- once we've trained them with repetition -- with the slightest mental command.
So, likewise, we may want to simply *arrange* our *mechanical* mechanisms -- as extensions of our *own* nervous / awareness systems -- so as to make them as self-diagnosing and self-maintaining as possible. Again, this would have nothing to do with "trust", which implies a *social bond* among *people* -- it has everything to do with good design and engineering.
Perhaps "trust" wasn't quite the right choice of word, but in creating machines that can self-diagnose or diagnose each other and not checking them ourselves, we are actually "trusting" that the circuits or whatever are working properly. My point is just that at some level human input will be needed, even if it is just to turn the machines on/off or check a few dials to make sure that nothing has gone wrong that the machines themselves have failed to pick up, and the more "intelligent" the machine, the more mundane the input needed is likely to be.
The *escalator* is an *excellent* example of something mechanical that has a *fallback* to something *manual* -- instead of climbing up flight after flight of stairs ourselves we can have escalators ferry us up for floor after floor. But -- in the event that an escalator may *break down*, it simply reverts back to *staircase* functionality by default.
*You* don't have to have *any* machines do your cooking and cleaning for you, but *others* *might* want that kind of automatic service, and *should* have access to such functionality. The mechanics of providing such service shouldn't be cumbersome, or energy intensive, or complicated, or prone to malfunction -- just as with any other tool we may come to use regularly. And, of course, there should be a manual fallback so that we're not *prevented* from fulfilling these routines ourselves by any machinery that happens to break down.
An escalator is a good example of a machine that, if it breaks down, still functions, but imagine that you are in a lift that breaks down. If a lift breaks down with somebody in it, that person may need the fire services to help them escape, which isn't quite so simple.
I also never said that people shouldn't have access to machines I don't personally want. Not all machinery would necessarily have a manual fallback, and some manual ways of doing things may require knowledge that the person who usually depends wholly on their technology hasn't learned, or outdated items that they don't own.
Well thanks to mass production techniques no one *has* to "look under the hood" if they don't want to -- consumer computer technology, including all of the pathways of the Internet, has become *very* useful, reliable, automated, and user-friendly. As long as *some* people have an interest in developing these technical means then the rest of us are free to benefit, especially with infinitely-replicable digital-based media and software. *Plenty* of previously extensive, complicated office-information-administration-type functions have now been made effortlessly available to the average computer user, thanks to the computing power and information networks now commonly available, not to mention more sophisticated audio-video applications and so on....
The internet might be able to tell me what may be the problem, but since I don't understand how computers work too well, I might not be able to tell the difference between two similar problems, and I would certainly not be able to fix it myself, especially if I couldn't be certain of the exact problem.
Computers will have to become an awful lot more idiot-friendly before the average person can rely solely on internet resources and trouble-shooting to sort out their problems, which is difficult when they are also constantly becoming more capable of more complicated tasks.
If people don't understand how things work, they can't sort out any problems with it.
While computer memory is mostly reliable, if there *is* a problem that pops up, the computer's operating software has the means to determine *which* memory addresses are being problematic. In this way *we're* not *manually* looking over *billions* of memory addresses *ourselves* -- it's being done *for us* by a differently functioning part of the computer's operating system, and can readily report problems *to* us, in which case we'd probably just *replace* the malfunctioning part altogether with an identical, mass-produced replacement part.
So there would still have to be someone around to make sure that no problems had been reported and that no action need be taken.
Or -- with better-designed *arrangements* of mechanical linkages we might even be able to *not* be interrupted for most mundane, foreseeable problems that pop up. Companies and institutions, for example, commonly have *automated backup routines* for their data archives so that even if a mechanical failure occurs, the all-important data itself still remains intact elsewhere, and is automatically recovered without the need for user attention or intervention.
There's nothing wrong with having a back-up at all, but it would still be useful if someone was aware of such a failure, in case it was a symptom of something more problematic.
Off-topic, I'd just like to point out that I find the stars you put around words (I assume for emphasis) really distracting when I'm trying to read what you've written.
ckaihatsu
27th December 2009, 17:11
Perhaps "trust" wasn't quite the right choice of word, but in creating machines that can self-diagnose or diagnose each other and not checking them ourselves, we are actually "trusting" that the circuits or whatever are working properly.
Well, in *this* case that you're describing we're in our economic role as *consumers* -- we're *forced* to trust the products by default because there's no real alternative to the capitalist mode of production for computer equipment. Fortunately, as 8bit mentioned, we have free / open-source *software*, including entire operating systems (Linux), to choose as an option over *commercial* products.
My point is just that at some level human input will be needed, even if it is just to turn the machines on/off or check a few dials to make sure that nothing has gone wrong that the machines themselves have failed to pick up, and the more "intelligent" the machine, the more mundane the input needed is likely to be.
I have to respectfully disagree here -- I think you've adopted conventional / traditional, and even *fictional*, conceptualizations of how machinery is meant to serve us. Again, the machine -- like a fancy handmade toy -- is only as good as the *human* efforts that go into its creation.
Do you *want* a machine that *requires* you to turn it on and off on a regular basis, or one which only relays its output through the interface of a few dials? Perhaps you *have* to work at a clunky, crude machine like the one you're describing, in order to make a living. In *this* case it's more of a *labor* issue you're describing since food money doesn't grow on trees, and the food trees themselves have been declared private property and are fenced off.
But as a strictly *technical*, *user*-oriented issue, the machine itself -- or, more realistically, the *software* for a *computer* system -- can be built-up and *configured* any which way you like, as long as *you're* the user....
The machine is only as "intelligent" as its design, so if you don't want to be bothered with turning it on and off or checking dials then one should build / engineer a suitable system that perhaps has an automatic low-power "sleep" mode, and that *emails* status updates right to your cell phone.
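That sleep-until-something's-wrong behavior is easy to sketch: a monitor stays silent while readings are normal and calls a notifier only on an anomaly. The notifier is injected, so in practice it could be an email or SMS gateway; here it just collects messages (all names are illustrative, not any real monitoring API):

```python
def monitor(readings, threshold, notify):
    """Stay quiet while readings are normal; call notify()
    only when a reading crosses the threshold."""
    for i, value in enumerate(readings):
        if value > threshold:
            notify(f"reading {i} out of range: {value}")


alerts = []
monitor([1.0, 1.2, 9.7, 1.1], threshold=5.0, notify=alerts.append)
print(alerts)  # ['reading 2 out of range: 9.7']
```

Swap `alerts.append` for a function that sends a text message and you have the "status updates right to your cell phone" arrangement.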
An escalator is a good example of a machine that, if it breaks down, still functions, but imagine that you are in a lift that breaks down. If a lift breaks down with somebody in it, that person may need the fire services to help them escape, which isn't quite so simple.
Agreed. Many -- perhaps most -- common consumer machines built by the capitalist (profit-driven) mass production system are *not* meant to have fallback functionality or user-serviceable parts if they break down. And we all know the reasons for *that* particular kind of engineering....
I also never said that people shouldn't have access to machines I don't personally want. Not all machinery would necessarily have a manual fallback, and some manual ways of doing things may require knowledge that the person who usually depends wholly on their technology hasn't learned, or outdated items that they don't own.
I'd appreciate some examples, though I tend to think that I'm in agreement here.
The internet might be able to tell me what may be the problem, but since I don't understand how computers work too well, I might not be able to tell the difference between two similar problems, and I would certainly not be able to fix it myself, especially if I couldn't be certain of the exact problem.
Ideally things from large to small should just have *replaceable parts* so that actual *fixing* (labor for maintenance) doesn't have to be imposed on anyone *whatsoever*. It's due to the capitalist system that the wages for some semi-skilled labor to *fix* things will often be cheaper than just getting a %^&$#@! replacement part for the machine in question.
Computers will have to become an awful lot more idiot-friendly before the average person can rely solely on internet resources and trouble-shooting to sort out their problems, which is difficult when they are also constantly becoming more capable of more complicated tasks.
If people don't understand how things work, they can't sort out any problems with it.
I think that the number of software features built into browsers (for accessing the web) has expanded greatly -- this means that there *is* more to be cognizant of, especially if one wants to *modify*, *customize*, or *configure* these features -- there *are* alternatives available for browsing the web if one wants to skip the fancier features altogether....
The other thing, too, is that as regular computers have exploded in speed and ability the users' scope of tasks done in the digital environment has increased in proportion. This means that many people are *trying* more complicated and sophisticated things, which, in turn, require more understanding and prowess over numerous, interacting functions. This would be true of learning *any* new tasks, though in the realm of software things are certainly *much more* intricate than most people are used to experiencing otherwise.
While computer memory is mostly reliable, if there *is* a problem that pops up, the computer's operating software has the means to determine *which* memory addresses are being problematic. In this way *we're* not *manually* looking over *billions* of memory addresses *ourselves* -- it's being done *for us* by a differently functioning part of the computer's operating system, and can readily report problems *to* us, in which case we'd probably just *replace* the malfunctioning part altogether with an identical, mass-produced replacement part.
So there would still have to be someone around to make sure that no problems had been reported and that no action need be taken.
Heh -- this is kind of funny to respond to, in our *current* state of technology -- are you "around"? -- are you *existing* -- ??? All you'd need *these* days is a text-messaging cell phone and your equipment can report problems / status updates to you wherever you'd like to be...(!)
There's nothing wrong with having a back-up at all, but it would still be useful if someone was aware of such a failure, in case it was a symptom of something more problematic.
Certainly.
Off-topic, I'd just like to point out that I find the stars you put around words (I assume for emphasis) really distracting when I'm trying to read what you've written.
Yeah, what can I say -- it's just how I happen to add *emphasis* rather than using italics -- it may be the *only* way in which I'm old-fashioned...!
smellincoffee
18th January 2010, 19:10
Something I notice while in factories is that increasing automation erodes the strength of workers. The only people I've ever known who could stand up to the managers were those who knew something about the work -- knowledge only obtainable through years of experience -- but they were few in number. Skilled workers, for whom technology is only a tool, can unionize and defend themselves through that if they are exploited, but workers who need no skills or real knowledge to do their jobs are powerless. In the last factory I worked in, the most skilled job available was forklift driver. Everyone else did labor that could be done by a trained chimpanzee -- dumping boxes of bottles onto a conveyor belt, for instance, or screwing caps onto bottles as they moved by. If these workers try to stand up for themselves, they're perfectly expendable: the bosses can fire the lot of them and find plenty of other people to do the "work". What concerns me is that more and more jobs are becoming skill-less like this, and consequently workers will have less and less power.
And, as Marx noted, this over-simplification of work also has the effect of making the work miserable.
I don't know if anyone here has read any Neil Postman, but in Technopoly and again in Amusing Ourselves to Death, he speaks as a cultural conservationist, viewing the increasing human reliance on technology -- not as tool, but as lifeblood -- as a danger. According to him, every technology carries its own meaning and changes the culture to which it is introduced: "the medium is the message" is one phrase I remember from the book. He's very much a fan of print culture as opposed to television.
Sean
18th January 2010, 19:44
Something I notice while in factories is that increasing automation erodes the strength of workers. The only people I've ever known who could stand up to the managers were those who knew something about the work, knowledge only obtainable through years of experience but they were few in number. Skilled workers --- for whom technology is only a tool -- can union and defend themselves through that if they are exploited, but those workers who need no skills or real knowledge to do their jobs are powerless.
Yes, but that's been an issue since industrialization. They say there are two methods of job security, at least in white-collar work.
One is to take your job and make it as arcane as possible, making you indispensable. The other is to make your job as efficient and open as possible, giving you room to move up. Hell, you can't go anywhere if no one else can take your job, right?
"They" of course are cappie managers, and you being the spare parts of the workforce.
When I was a little younger and hadn't any class consciousness (other than being a have-not hating the rich), I'd walk into someone else's position if they were off sick, decode their crappy ways and make it "simpler". It was just enthusiasm and flexing what I could do. Unless you can be a complete fucker to your fellow man, that behaviour doesn't put you where you think you rightfully belong, running the show, but rather gets you more work to do and opens the threat that there are plenty on the dole queue who could do it just as well.
Of course what I was unwittingly doing was destroying the obfuscation that someone else had built up in an attempt to retain a position as a skilled instead of an interchangeable worker.
I'm sure that's true from machinists through to programmers (just try to read some of the code I've done on a payroll without the Rosetta Stone handy).
Without knocking technocracy (not too hard), until we've achieved some kind of high level of universal scientific education - that is, understanding scientific methods and principles, not the pseudo-science pulp garbage that pollutes the curious mind with fanciful nonscience (I'm thinking of those stupid ghost-buster types on TV) - I'm all for Luddites. It might sound like a massive tangent I'm off on, but until you have a superstitionless society, you can't have an automated one.
Wait until people are either not interchangeable parts, or at least equal, before building interchangeable parts.
ckaihatsu
19th January 2010, 04:49
The topic of technology and technological progress / "progress" is a surprisingly tough issue for Marxism. It's the *only* issue that becomes difficult for Marxism, and not because of any shortcomings on the part of Marxism itself, but because of the nature of work and scientific progress under capitalist management / "leadership", going forward.
The contradictions between [1] labor and [2] capital, [3] worker and [4] consumer, and [5] human and [6] machine, really reach a fever pitch in the critical convergence of the workplace's manufacturing process.
Something I notice while in factories is that increasing automation erodes the strength of workers. The only people I've ever known who could stand up to the managers were those who knew something about the work, knowledge only obtainable through years of experience -- but they were few in number. Skilled workers -- for whom technology is only a tool -- can unionize and defend themselves if they are exploited, but those workers who need no skills or real knowledge to do their jobs are powerless.
The interests of workers in the present time-frame should take *precedence* over all else, but since social reality is not this accommodating, we have a far more complicated reality to deal with under capitalism's regime.
The present social reality is further complicated by capitalism's artificial cleavage of the social roles of [3] worker and [4] consumer -- our interests as workers for collective control in the workplace over *existing* technological processes actually *objectively conflict* with our interests as our own counterparts as *consumers*. As consumers, or relatively privileged recipients of the outputs of the productive process, we have been *separated* from our halves who do the actual *work* to produce the stuff that we consume -- and [1] laborers, of course, are *separated* from the *means* of the mass production -- [2] capital -- by capitalism's regime of private property.
If these *six* social-role components could be *integrated* into a single *whole* social being then we wouldn't *have* to bother with all of the resulting complications, including respective types of alienation, that we routinely experience as a direct result of this productive-and-consumptive apartheid.
Instead, we're directed against *ourselves* in artificially structured conflicts of [1] commodifying ourselves as workers for the productive process (labor) versus [2] participating in doing the same to others if any labor-earned surplus is to be made productive in the markets (capital), [3] Luddite (cooperative workers' rule) versus [4] techno-visionary (enabled, individuated consumer), and [5] sentient creatures of the earth (human) versus [6] tool-makers and -users (machine).
And, as Marx noted, this over-simplification of work also has the effect of making the work miserable.
those workers who need no skills or real knowledge to do their jobs are powerless.
Worse yet, for those parts of the world's economy that can't stay competitive -- by the rules of the market (hyper-exploitation of labor) -- to other parts of the economy, the entire *manufacturing* basis is pulled out from under labor entirely, leaving it to a feudal-like, master-and-servant relationship called "the service sector".
Of course what I was unwittingly doing was destroying the obfuscation that someone else had built up in an attempt to retain a position as a skilled instead of an interchangeable worker.
Given the profound balkanization of labor solidarity under these conditions of "the service sector" it's no wonder that the more-skilled workers would want to create as much personalized "specialization" around their own job positions as possible, no matter what the consequences might be for consumers and for scientific / technological progress in general, such as it may be (under capitalism).
I'm all for luddites. It might sound like a massive tangent I'm off on, but until you have a superstitionless society, you can't have an automated one.
Wait until people are either not interchangeable parts, or at least equal, before building interchangeable parts.
I don't know if anyone here has read any Neil Postman, but in Technopoly and again in Amusing Ourselves to Death, he speaks as a cultural conservationist, viewing the increasing human reliance on technology -- not as tool, but as lifeblood -- as a danger.
This contradiction between objectively needing labor solidarity and collective worker control over society's technology, and our objective, tool-using human reliance on technology, creates all kinds of problems for us in Marxist practice.
Certainly we continue to favor labor solidarity and collective workers' control, but this will inevitably come into conflict with our roles as technology-empowered and -favoring consumers who benefit from the automation of work roles (thus displacing workers).
until you have a superstitionless society, you can't have an automated one.
Without knocking technocracy (not too hard), until we've achieved some kind of high level of universal scientific education - that is, understanding scientific methods and principles, not the pseudo-science pulp garbage that pollutes the curious mind with fanciful nonscience (I'm thinking of those stupid ghost-buster types on TV) -
The scientific method is *very* simple and well-known, but that's not even the *point* these days -- more to the point is *what social forces* are given the reins to direct future scientific pathfinding.
In the twentieth century, a hypothetico-deductive model for scientific method was formulated (for a more formal discussion, see below):
1. Use your experience: Consider the problem and try to make sense of it. Look for previous explanations. If this is a new problem to you, then move to step 2.
2. Form a conjecture: When nothing else is yet known, try to state an explanation, to someone else, or to your notebook.
3. Deduce a prediction from that explanation: If you assume 2 is true, what consequences follow?
4. Test: Look for the opposite of each consequence in order to disprove 2. It is a logical error to seek 3 directly as proof of 2. This error is called affirming the consequent.
This model underlies the scientific revolution.
http://en.wikipedia.org/wiki/Scientific_method
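The falsification emphasis in step 4 -- seek disconfirming cases, don't collect confirmations -- can be sketched as a toy loop in code. (This is my own illustration, not from the Wikipedia article; all the names and the example conjecture are made up.)

```python
# Toy sketch of the hypothetico-deductive loop: a conjecture survives
# only as long as no test case falsifies its deduced predictions.

def conjecture_applies(n):
    """Step 2, a deliberately bad conjecture: 'every odd number
    greater than 1 is prime' -- so it applies to odd n."""
    return n % 2 == 1

def prediction_holds(n):
    """Step 3, the deduced prediction: n has no divisors other
    than 1 and itself."""
    return all(n % d != 0 for d in range(2, n))

def falsified(candidates):
    """Step 4: hunt for a counterexample rather than confirmations.
    Returns the first case where the conjecture applies but its
    prediction fails, or None if no test falsifies it."""
    for n in candidates:
        if conjecture_applies(n) and not prediction_holds(n):
            return n
    return None

print(falsified(range(3, 20)))  # prints 9: the conjecture is falsified
```

Note that piling up confirming cases (3, 5, 7 are all prime) would never have settled anything; the single counterexample 9 does.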
Until *workers* are the ones to control the means of *scientific* / *technological* progress and production in a collective-interest, rational kind of way, we're left to the capitalist markets' "invisible hand" being the one gripping the reins, for both consumers' benefit and the world's destruction in high-tech militaristic world wars.
According to [Postman], every technology carries its own meaning and changes the culture to which it is introduced: "the medium is the message" is one phrase I remember from the book. He's very much a fan of print culture as opposed to television.
Herbert Marshall McLuhan, CC (July 21, 1911 – December 31, 1980) was a Canadian educator, philosopher, and scholar — a professor of English literature, a literary critic, a rhetorician, and a communication theorist. McLuhan's work is viewed as one of the cornerstones of the study of media theory.
McLuhan is known for the expressions "the medium is the message" and "global village". McLuhan was a fixture in media discourse from the late 1960s to his death and he continues to be an influential and controversial figure. More than ten years after his death he was named the "patron saint" of Wired magazine.
http://en.wikipedia.org/wiki/Hot_media
(Just for the record, this deification of McLuhan's "the medium is the message" is bullshit and drives me up the fucking wall. It's *not* the medium itself, it's the *content programming* for either -- both print and broadcast media can be used for socially progressive or socially reactionary purposes, depending on the balance of class forces.)
(It just so happens that in the past, avenues of print communication were much more available to workers, in their role as consumers, than the monopolized medium of video broadcast was. With the mainstreaming of the *digital* medium, the Internet, the production of video-based content has now become commonly available to workers, as consumers, thereby undercutting the conventional broadcast television media monopoly in the "hot" medium of video.)
Chris
--
RevLeft.com -- Home of the Revolutionary Left
www.revleft.com/vb/member.php?u=16162
Photoillustrations, Political Diagrams by Chris Kaihatsu
community.webshots.com/user/ckaihatsu/
3D Design Communications - Let Your Design Do Your Footwork
ckaihatsu.elance.com
MySpace:
myspace.com/ckaihatsu
CouchSurfing:
tinyurl.com/yoh74u
-- Taking jadedness far beyond the gemstone industry --
cska
22nd January 2010, 14:12
Do us technophobes count as Luddites? :lol: I'm of the belief that the simpler something is, the better it is - fewer moving parts, user-friendliness, etc., the basic Kalashnikov doctrine. When someone talks about random-access memory, CPUs and all the other associated words, my mind goes to sleep.
Nope. I like new technology but hate how it just becomes complicated. Technology should make our lives simpler, not harder. I found this interface design team that has the right philosophy: http://humanized.com/about/
cska
22nd January 2010, 14:22
Yes, but that's been an issue since industrialization. They say there are two methods of job security, at least in white-collar work.
One is to take your job and make it as arcane as possible, making you indispensable. The other is to make your job as efficient and open as possible, giving you room to move up. Hell, you can't go anywhere if no one else can take your job, right?
"They" of course are cappie managers, and you being the spare parts of the workforce.
When I was a little younger and hadn't any class consciousness (other than being a have-not hating the rich), I'd walk into someone else's position if they were off sick, decode their crappy ways and make it "simpler". It was just enthusiasm and flexing what I could do. Unless you can be a complete fucker to your fellow man, that behaviour doesn't put you where you think you rightfully belong, running the show, but rather gets you more work to do and opens the threat that there are plenty on the dole queue who could do it just as well.
Of course what I was unwittingly doing was destroying the obfuscation that someone else had built up in an attempt to retain a position as a skilled instead of an interchangeable worker.
I'm sure that's true from machinists through to programmers (just try to read some of the code I've done on a payroll without the Rosetta Stone handy).
Without knocking technocracy (not too hard), until we've achieved some kind of high level of universal scientific education - that is, understanding scientific methods and principles, not the pseudo-science pulp garbage that pollutes the curious mind with fanciful nonscience (I'm thinking of those stupid ghost-buster types on TV) - I'm all for Luddites. It might sound like a massive tangent I'm off on, but until you have a superstitionless society, you can't have an automated one.
Wait until people are either not interchangeable parts, or at least equal, before building interchangeable parts.
And you think getting rid of technology is going to make people more likely to believe real science rather than pseudo-science? Besides, how does pseudo-science adversely affect our use of technology? No, the answer is for a communist society to stress scientific education while keeping technology.