
View Full Version : Technological Singularity



Redistribute the Rep
3rd January 2014, 05:14
The technological singularity, or simply the singularity, is a hypothetical moment in time when artificial intelligence will have progressed to the point of a greater-than-human intelligence, radically changing civilization, and perhaps human nature.[1] Since the capabilities of such an intelligence may be difficult for a human to comprehend, the technological singularity is often seen as an occurrence (akin to a gravitational singularity) beyond which the future course of human history is unpredictable or even unfathomable.


I've heard estimates of us reaching this point as early as 2045

Any ideas on how this will affect capitalism and future socialist efforts??

Sperm-Doll Setsuna
3rd January 2014, 09:12
Pseudo-religious hogwash from technofetishists, is what the idea of this singularity is. A global revolution would be equally a point beyond which things are unpredictable... But they apparently have a longing for faith in something and this nebulous singularity serves as some sort of substitution for a religious day of reckoning.

Sabot Cat
3rd January 2014, 09:41
The singularity is essentially projecting out advancements in technology until the societal implications are kind of hard to account for, or in syllogism form:

(1) Computers have been getting progressively faster processing speed/power/etc.
(2) They've been catching up to the capacity of human brains.
(3) At or around this point (2045/2055/etc.), the average computer would have the horsepower of numerous human brains at once, and if we have artificial intelligence at least as smart as a person with that capability would be, who knows what could happen!

That's kind of lacking the nuance of their actual predictions, but these computer scientists are hoping that we will develop hard AI, and that hard AI will eventually be intelligent enough to make its own scientific contributions unparalleled by any human scientist, and bootstrap itself to make improvements on its own code. This is speculative, but I wouldn't dismiss it out of hand. We have seen sure steps towards more "intelligent" machines, and although the press and science fiction authors have often jumped the gun in anticipation of these kinds of technological developments, it's only a matter of time before they're created, even if it's not going to conveniently fall into our laps.
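The extrapolation behind premises (1)-(3) can be sketched numerically. Every figure below is an assumption for illustration only: ~1e16 operations/second is one commonly cited ballpark for the human brain, 1e13 ops/second stands in for a 2014-era high-end machine, and the 18-month doubling period mirrors the classic Moore's-law trend. None of these numbers are settled facts.

```python
import math

# Assumed figures (see lead-in); change any of them and the date moves.
BRAIN_OPS = 1e16          # assumed brain-equivalent ops/second
START_OPS = 1e13          # assumed 2014-era machine, ops/second
DOUBLING_YEARS = 1.5      # assumed doubling period (Moore's-law-style)

def crossover_year(start_year: int = 2014) -> float:
    """Year at which the trend line crosses the assumed brain-equivalent rate."""
    doublings = math.log2(BRAIN_OPS / START_OPS)
    return start_year + doublings * DOUBLING_YEARS

print(round(crossover_year()))  # -> 2029 under these particular assumptions
```

The point of the sketch is how sensitive the projected date is to the inputs: halve the doubling speed or raise the brain estimate by a factor of ten and the crossover slides by years, which is why published singularity dates scatter across decades.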

tallguy
3rd January 2014, 10:49
Pseudo-religious hogwash from technofetishists, is what the idea of this singularity is. A global revolution would be equally a point beyond which things are unpredictable... But they apparently have a longing for faith in something and this nebulous singularity serves as some sort of substitution for a religious day of reckoning.

Yep, completely agree. It's hogwash.

ÑóẊîöʼn
3rd January 2014, 21:08
There's a lot of hype and misunderstanding surrounding the concept, but I think it's a supremely grave mistake to dismiss it as "Pseudo-religious hogwash from technofetishists", and not just because some very smart and/or rich people are taking it seriously.

Quoting from a tangentially-related document: Mini-FAQ on Artificial Intelligence (http://bbs.stardestroyer.net/viewtopic.php?f=49&t=136633)


I am the technical director of a small start-up which has been developing a fairly revolutionary automated software engineering system. Six years of R&D and we're getting close to market, but still not there yet. Fortunately we got a research grant for the first two years, so I was able to focus on fairly open-ended general AI research to start with, building on earlier work I'd done. Later it progressively focused down on something that could be a good first product; we've done a wide range of software consulting to fund development of that, about half of it AI-focused. I have a plan to shoot for general AI later, but we'll need more funding and quality staff to have any realistic chance of pulling it off.

Further back, I was a research associate at the Singularity Institute for AI for a while, late 2004 to late 2005ish, I'm not involved with them at present but I wish them well. I got started in AI doing game AI and making lots of simple prototypes (of narrow AI and very naive general AI concepts) as a teenager, and I took all the AI and psychology modules I could during my CompSci degree.

...

Secondly AGIs are pretty much by default vastly more intelligent than humans. Even moderately futuristic hardware will pack a lot more ops/second and ops/watt than a human brain, and really good AI code (which is what you'll have after a few centuries of AIs designing AIs at the latest) is /much/ more efficient at actually harnessing that power (thousands to billions of times more efficient depending on the problem). If you look at technology going to physical limits, then an AI based on a nanotech processor the size of a human brain is likely to have millions of times the effective compute power and storage and billions to trillions of times the effective problem solving power.

This is incidentally why the Singularity is a big deal in the first place - a combination of how much the human brain sucks compared to what technology can do and how quickly AI code and hardware is likely to improve once the first general AI is built. All those idiots messing around with 'rate of technological change' graphs are utterly missing the point, and even general nanoassemblers are essentially a sideshow.

The long and short of it is, Artificial General Intelligences are a massive game-changer and there is no indication that they are impossible, and many reasons to believe that they can be constructed.

CrimsonSerpent
4th January 2014, 23:04
I kind of have faith that if an intelligent life form develops from technology, it might understand dialectical materialism better than most adult humans, which would allow it to see how the world moves through capitalism and possibly aid in advancing revolution and socialism. Although I think it would take a much longer time for that type of life form to become self-aware.

BIXX
5th January 2014, 09:43
If the singularity happens, life will go on. Or maybe it won't. I dunno.

Really, it doesn't seem worth my time and brain power right now. Later it might be, if we can get a look at what the singularity might look like, but until then it is us nerds guessing about the Robopocalypse (which I am not averse to, I just feel we should recognize it for what it is).

Jimmie Higgins
5th January 2014, 10:02
There's a lot of hype and misunderstanding surrounding the concept, but I think it's a supremely grave mistake to dismiss it as "Pseudo-religious hogwash from technofetishists", and not just because some very smart and/or rich people are taking it seriously.

I agree it shouldn't be dismissed, but there is also quite a bit of techno-rapture mixed in with a lot of the speculation: ideas that it would solve the problems of capitalism and overcome it, or that it would perfect it, with machines taking over investments and moving capital faster than humans can comprehend.

So considering the industrial advances made by capitalism (which must have felt a little like a singularity for people born at the end of the nineteenth century), I don't think a future rapid leap in tech development is out of the question at all. But if it happens under capitalism, the development will be uneven and highly contradictory, with the benefits going to the top while the rest of us have to conform to the new needs of a rapidly changing world.

ckaihatsu
5th January 2014, 19:55
This topic has had several prior threads and I remain amazed that comrades who wouldn't hesitate for a *second* to impose their collective class will to stop the runaway mechanism of *markets* suddenly look passive and helpless over the question of runaway *technology*.

While the understandable anxiety might be that of possibly looking like Luddites, we should primarily never forget that *all* machinery is the offspring of human intentions, and that we *do* already have a collective oversight of sorts in existing social norms -- there are social taboos over *many* technologies and materials already, and I don't think controversial developments would just be allowed to baby-step their way past general societal notice.

It takes a *human* act of will to 'let the reins go' and allow something suspect like having "computers program computers" -- so maybe this kind of step is something that should be politicized and publicized well-in-advance as a general no-no, instead of just adopting a fatalistic 'oh-well' mindset by default.

I'll maintain that *anyone* who sees the computer-overlord scenario as an inevitability is actually taking a *political position* for that due to their blasé attitude.

Remus Bleys
5th January 2014, 19:59
words
It's not that it is a concept we fear or whatever; it's a concept that is bullshit.
How does "Pseudo-religious hogwash from technofetishists, is what the idea of this singularity is. A global revolution would be equally a point beyond which things are unpredictable... But they apparently have a longing for faith in something and this nebulous singularity serves as some sort of substitution for a religious day of reckoning." sound like "oh no, I am afraid of technology!"?

Sabot Cat
5th January 2014, 21:27
I think comparing the prediction of an improbable but still possible event (one which brings hope to someone who nonetheless holds a materialist framework) to the Christian rapture is a poor means of argumentation. Many people also view worldwide proletarian revolution as a Marxist eschatology, for instance, and find it satisfactory to denounce it by rhetoric rather than by reason. If you want to refute the idea of a technological singularity, you have to do more than draw parallels to religion.

You could ask questions like: Even if we are able to achieve computers with the processing speed and capabilities of a human brain, how could we develop it to be smarter than us? Wouldn't it make more sense if we could only build a machine as smart?

I would answer that if we can create an artificial intelligence with a level of cognitive capability matched not just by the average person, but an especially smart one, then we could mass produce such a mind, and the likelihood of intellectual achievements akin to those made by scientific luminaries drastically increases, especially because an AI would have the benefit of accessing the vast repository of digitally stored knowledge.

The only significant obstacle for the above is creating a computer that can emulate the capabilities of a human brain. There is nothing clearly insurmountable in figuring out how our brains work on a conceptual, informational level, and we are making progress by leaps and bounds in both computational technologies and neurology. Furthermore, I think we can at least emulate something through conscious effort that was slapped together by subtle, unthinking processes.

The technological singularity is a scientific hypothesis that relies upon the premise that we can eventually give computers the capabilities of incredibly intelligent people. Although many speak with reverence about the possibly extraordinary implications of such a feat, one should not think that it is thus pseudoscience. It is simply not in our hands yet.

Jimmie Higgins
6th January 2014, 09:44
Rapture or revolution? Personally, my criticism is not about whether rapid tech advancement is possible or desirable, but about the conclusions some draw in thinking that tech will "solve" or "perfect" the problems of capitalism. One of the biggest proponents of the singularity wrote a whole book that's mostly about how computers will make better financial investment decisions than humans because they will understand complex futures scenarios and so on.

Luddism is a good parallel. For the capitalists and professionals, industrialization meant rationalizing, organizing, saving labor time, standardizing labor efforts, etc.; who but backwards cavemen could be against that? Well, the artisans being proletarianized and the proles being time-disciplined by machines, made appendages of machines, did not feel that these tech advancements advanced their own lives. Luddism is a dead end, but the fears are legitimate class fears about losing what little control people have over their own lives and conditions. Fears of newer tech are no different. In the U.S., where basic health care is a hassle and people are already economically barred from known and even simple health treatments, fears of "runaway" genetics or of health inequalities that extend life for the rich are not primitives cowering before fire.

Our lives are run by irrational (from a neutral human standpoint) and alien systems... Fears of technological advances creating wider gaps reflect class anxieties over loss of control and the devaluation of labor. Workers tend to fear tech controlling them (robots taking over, assimilating us or tearing us apart), whereas the ruling class tends to fear technology gaining autonomy: Frankenstein.

At any rate a singularity would not change capitalism as much as the rules of capitalism would inform and shape the course of tech advancement and that's my main disagreement with the tech Utopians.

ckaihatsu
6th January 2014, 17:22
I mean to impress upon everyone that the 'purely autonomous, purely individualistic' paradigm for the proposed technology is a *misconception* to begin with -- maybe it would be possible if inorganic matter somehow managed to self-assemble and grow and evolve the way *organic* matter has on the earth, but that's not the case. No matter how complex and powerful technology can become it's not going to be possible to give it a 'blank slate', intention-wise, because technology is, by definition, the product of *human* intentions.

I don't think *anyone* here would have a problem with *politicizing* any potential tech issue, and calling for individuals to be held accountable for whatever complications arise, the way we would with *any other* societal-type issue.

*This* is the way we should conceptualize all of this, not as some future 'metal baby' that grows to usurp its human parents and then all of humanity.

ÑóẊîöʼn
7th January 2014, 01:10
At any rate a singularity would not change capitalism as much as the rules of capitalism would inform and shape the course of tech advancement and that's my main disagreement with the tech Utopians.

If "the rules of capitalism inform and shape the course of tech advancement", then why are so many capitalists so intent on replacing their paying customers with non-paying robot drones?

Actually, they're not doing that intentionally, but it is in fact a(n unintended?) side-effect of their never-ending quest to keep up the rate of profit. For example, Amazon are dabbling in replacing their workers with drones (http://www.bbc.co.uk/news/technology-25180906).

The nature of decision-making under capitalism punishes long-term planning while rewarding short-term profiteering. This would seem to include perpetuating the vicious cycle of automation, in which capitalists accelerate automation in order to keep up profitability, but in doing so destroy the economic power of their customers (because their jobs are now done by machines), which in turn calls for yet more automation and so on...
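The vicious cycle described above can be caricatured as a toy feedback loop. To be clear, everything in this sketch is an assumption for illustration (the fixed automation rate, the shortcut of equating demand with the wage bill); it is not an economic model, just the shape of the spiral:

```python
# Toy feedback loop (illustrative assumptions only): each round, firms
# automate away a fixed fraction of remaining jobs to protect profitability,
# shrinking the wage bill and, in this caricature, aggregate demand with it.
def automation_spiral(rounds: int = 10, employment: float = 1.0,
                      automation_rate: float = 0.1) -> list:
    demand_history = []
    for _ in range(rounds):
        employment *= (1 - automation_rate)   # jobs replaced by machines
        demand_history.append(round(employment, 3))  # demand tracks wages here
    return demand_history

print(automation_spiral())  # demand decays geometrically, round after round
```

Each firm's move is individually rational under short-term profit pressure, yet the loop erodes the customer base every firm depends on, which is the contradiction at issue.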

Jimmie Higgins
7th January 2014, 10:01
If "the rules of capitalism inform and shape the course of tech advancement", then why are so many capitalists so intent on replacing their paying customers with non-paying robot drones?
Because of precisely what you said: competition between firms favors the short term quick return and so saving labor, increasing the rate of exploitation, is needed for that competition. It's always a problem for us, but it doesn't become a problem for capitalists until they find their tendencies have eroded their general foundation.

Maybe I missed your point but I don't see how a tech singularity would be outside of these considerations and not develop based on increasing exchange and exploitation.

ÑóẊîöʼn
7th January 2014, 21:20
My point is that such contradictions are more than sufficient to produce unintended phenomena, and it's hard to deal with such side-effects in a truly effective manner when the temporal range of action barely extends beyond three months, or however long it takes for majority shareholders/stakeholders to get pissed off (and any correction that is undertaken is highly likely to be actively undermined). In such a desperately short-sighted pursuit of the almighty moolah, it hardly seems beyond the pale to suppose that the capitalists will make serious mistakes even if they don't end up shooting themselves in the foot because they thought it would make them money.

And this is before we even get to the creation of the first AGIs, which, potential Singularity aside, would have a tectonic impact. The prospect of losing one's job to a machine would become a reality for a great many more workers.

So no, I don't agree that a technological Singularity "wouldn't change capitalism"; in fact, the above is part of the reason I believe that the technological conditions required to bring one about would be hardly conducive, if not outright hostile, to the continued survival of capitalism as a global system we'd recognise today.

TheSocialistMetalhead
7th January 2014, 21:38
Everybody's talking about technological singularity and I'm just sitting here waiting for the first quantum computer...

On a more serious note, I agree that it's a bit shortsighted to just dismiss it as a concept to be confined to the realms of science fiction. There are some very intelligent people out there who make very good points in the idea's defense. However, what we should realise is that in recent history, a lot of people have gotten a little overzealous with their predictions: "We're gonna run out of X in year Y", "We'll have a moon base in year Z", and yes, "We will have robots who can take over all our jobs by year Q".

The point is, because of the huge technological leaps we've seen in the past two centuries, we've become a little too confident in the human drive for progress and in our own intelligence. It wouldn't surprise me if we reached this level of technology one day, but it's equally probable that I won't be around to see it.

Jimmie Higgins
8th January 2014, 11:27
So no, I don't agree that a technological Singularity "wouldn't change capitalism", in fact the above is part of the reason I believe that the technological conditions required to bring one about would be hardly conducive, if not outright hostile, to the continued survival of capitalism as a global system we'd recognise today.
Well, maybe "wouldn't change capitalism" is the wrong way to put it, since I do think capitalism is always changing and adapting. But I do think that tech by itself cannot fundamentally change things with regard to class rule and exploitation. The industrial and digital eras of advancement did change capitalism, while not altering its foundations and basic drives and logic. Technology is the application of scientific knowledge, and so the context of that tech makes all the difference in how things are implemented and to what ends. Knowledge of steam engines in classical Rome did not create an industrial revolution because saving labor had no added incentive in a slave society.

It involves some basic contradictions of the system: the drive to save labor time while increasing labor performed from each worker; the drive to innovate, but that innovation being held back by the parameters of the system.

Could new advances create new conditions or reorganize capitalism in ways that create new struggles or new class dynamics? Certainly, and it happens frequently. But controlling the means of production also means having general influence over tech initiatives and development, and we can see this with digital tech, where information property has only become more controlled as it becomes essentially self-replicating with little effort.

ÑóẊîöʼn
8th January 2014, 20:11
Well maybe "wouldn't change capitalism" is the wrong way to put it since I do think capitalism is always changing and adapting. But I do think that tech by itself can not fundamentally change things in regards to class rule and exploitation. Industry and digital eras of advancement did change capitalism, while not altering the foundations and basic drives and logic.

They've not been around very long. Humans have barely begun to explore their possibilities.


Technology is the application of scientific knowledge and so the context of that tech makes all the difference in how things are implemented and to what ends. Knowledge of steam engines in classical Rome did not create an industrial revolution because saving labor had no added incentive to a slave society.

Making proper use of steam power also requires that precision engineering be somewhat widespread (or the potentiality of being made widespread has to be there), otherwise you get steam engines that leak steam and are very inefficient. The fact that thus far we have found only one Antikythera mechanism would seem to bear this out. There is also the question of whether Roman metallurgy would have been up to the task of producing the appropriate alloys, and a lack of knowledge concerning the Bessemer process (http://en.wikipedia.org/wiki/Bessemer_process) would have been a major obstacle.

In fact, it would not surprise me if some degree of knowledge of these obstacles on the part of Roman engineers constituted part of the disincentive in Roman society against steam power. Even today engineers are taught to apply what is known rather than push the boundaries like scientists, and thus it doesn't seem unreasonable to surmise that Roman engineers saw the problems as insurmountable. Plus, there were plenty of slaves available to state and private interests, which would serve to put a significant damper on any research.


It involves some basic contradictions of the system: the drive to save labor time while increasing labor performed from each worker; the drive to innovate, but that innovation being held back by the parameters of the system.

I think that the potential importance of technological research and development has been a lot more ingrained into the general mindset of today than it was in Roman or even early industrial times, when fewer people had heard of anything like the "scientific method".


Could new advances create new condition or reorganize capitalism in ways that create new struggles or new class dynamics... Certainly and it happens frequently. But controlling the means of production also means having general influence over tech initiatives and development and we can see this with digital tech where information property has only become more controlled as it becomes essentially self-replicating with little effort.

Information property is more controlled?! It's now easier than ever to pirate music, films, and books and then disseminate the results across the entire world. Snowden's spilling of the beans could not have been done in the same way and to the same extent in a world devoid of electronic communications. The very ease of replication is what makes copying of information so hard to control, and it's not for lack of trying on the part of media capitalists and the state.

State and capital can control the direction of research, in broad terms at least, but that does not mean that they can divine each and every unintended consequence of whatever research path they decide to pursue. Digital media is a good example of this - its throwaway nature increased profitability (easily damaged tapes and CDs would need to be bought again and again), but as it turned out that same nature also meant that the private individual became better placed to make copies of their own. So now it is entirely possible to acquire multiple media formats without having to pay a penny (apart from the cost of an internet connection).

ckaihatsu
8th January 2014, 23:04
Maybe we can address the original topic a bit more pointedly and speculate as to whether there *could* be a tipping point at which technological developments themselves could usurp class relations altogether -- I could see this as being possible if any given person had full individual control over agricultural and manufacturing processes that would effectively displace conventional capitalist processes, and thus the entire capitalist paradigm.

(Because if ownership could only continue to lay claim to parcels of land, and no longer be able to blackmail the world's population over food, energy, and materials, then that would be politically tantamount to a revolution.)


http://en.wikipedia.org/wiki/Claytronics


http://en.wikipedia.org/wiki/Grey_goo