The Singularity: a n00b question

  1. Red October
    I'm new to technocracy and transhumanism, so I have some questions. What is "the singularity"? How do we "become the singularity"? And why is this desirable? Any help at all would be great.
  2. chimx
    I would rather hear about the factual basis for this belief.
  3. Sentinel
    See my post here. It contains some of my own beliefs, as well as the following excerpt from the WTA Transhumanist FAQ on the issue of the Singularity:

    Transhumanist FAQ

    2. Technologies and Projections

    2.7 What is the singularity?

    Some thinkers conjecture that there will be a point in the future when the rate of technological development becomes so rapid that the progress-curve becomes nearly vertical. Within a very brief time (months, days, or even just hours), the world might be transformed almost beyond recognition. This hypothetical point is referred to as the singularity. The most likely cause of a singularity would be the creation of some form of rapidly self-enhancing greater-than-human intelligence.
    The concept of the singularity is often associated with Vernor Vinge, who regards it as one of the more probable scenarios for the future. (Earlier intimations of the same idea can be found e.g. in John von Neumann, as paraphrased by Ulam 1958, and in I. J. Good 1965.) Provided that we manage to avoid destroying civilization, Vinge thinks that a singularity is likely to happen as a consequence of advances in artificial intelligence, large systems of networked computers, computer-human integration, or some other form of intelligence amplification. Enhancing intelligence will, in this scenario, at some point lead to a positive feedback loop: smarter systems can design systems that are even more intelligent, and can do so more swiftly than the original human designers. This positive feedback effect would be powerful enough to drive an intelligence explosion that could quickly lead to the emergence of a superintelligent system of surpassing abilities.
    The singularity-hypothesis is sometimes paired with the claim that it is impossible for us to predict what comes after the singularity. A post-singularity society might be so alien that we can know nothing about it. One exception might be the basic laws of physics, but even there it is sometimes suggested that there may be undiscovered laws (for instance, we don’t yet have an accepted theory of quantum gravity) or poorly understood consequences of known laws that could be exploited to enable things we would normally think of as physically impossible, such as creating traversable wormholes, spawning new “basement” universes, or traveling backward in time. However, unpredictability is logically distinct from abruptness of development and would need to be argued for separately.
    Transhumanists differ widely in the probability they assign to Vinge’s scenario. Almost all of those who do think that there will be a singularity believe it will happen in this century, and many think it is likely to happen within several decades.
    Basically, technological progress is already constantly accelerating. The Singularity is the climax of this acceleration, the point at which technological progress becomes an unstoppable, lightning-fast process.
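    To make that feedback loop concrete, here is a toy model of my own -- it is not part of the FAQ, just an illustration. It assumes, purely for the sake of argument, that both the size and the speed of each improvement scale with the current system's capability, so capability grows like dI/dt = k*I^2; every number in it is arbitrary rather than an estimate:

        # Toy model of the intelligence-explosion feedback loop: smarter systems
        # design still-smarter systems, and do so faster. Together the two
        # effects give dI/dt = k * I^2, which diverges in finite time -- the
        # "nearly vertical" progress curve. All numbers are illustrative only.
        capability = 1.0   # capability of the first designer, arbitrary units
        k = 0.1            # assumed improvement gained per unit of capability

        for generation in range(25):
            print(f"generation {generation:2d}: capability {capability:.3g}")
            # Each generation improves its successor in proportion to the
            # square of its own capability.
            capability += k * capability * capability
            if capability > 1e15:  # the curve is now effectively vertical
                print("capability diverges -- the toy model breaks down here")
                break

    Note how little seems to happen for the first dozen generations; that slow start is why such an explosion could look comfortably distant right up until it isn't.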

    I'm of the opinion that humans must strive to accomplish this gigantic technological upswing by themselves -- with the aid of technology, and by merging with technology.
    On the other hand, I dread a scenario in which technology could become the Singularity by itself, independently of human actions -- for example in the form of a powerful, self-conscious AI.

    This is what I mean by wanting to become the Singularity. I hope I cleared at least some things up; please ask more precisely if there's something else you don't understand. Also check the transhumanist FAQ for the entire WTA description of the Singularity.
  4. chimx
    the point at which technological progress becomes an unstoppable, lightning-fast process.
    What I would like to understand is why people think this would occur, especially considering production constraints.
  5. Dimentio
    What I would like to understand is why people think this would occur, especially considering production constraints.
    It will happen sometime, unless the entire system collapses before that. If we build a post-capitalist society which is successful in developing the productive forces, the human race will probably look a lot different in the 2150s.
  6. chimx
    That doesn't answer my question on the 'why'.
  7. Dr Mindbender
    Reading Sentinel's post, I think the more likely cause of a singularity would be the intervention of a highly advanced ET lifeform, rather than a super-intelligent AI.
    I think there are too many conflicts of interest to allow cybernetic forms to become so intelligent.
  8. ÑóẊîöʼn
    Reading Sentinel's post, I think the more likely cause of a singularity would be the intervention of a highly advanced ET lifeform, rather than a super-intelligent AI.
    A visitation by highly advanced aliens would constitute an Outside Context Problem rather than a Singularity, as I see no reason why an advanced alien civilisation would grant us any of their technology, although it is a possibility. But even then, the Singularity would come afterwards if we survived first contact. The more advanced the alien race, the more likely it is that they might destroy us, not necessarily out of malice, but because of the possibility of them doing the alien equivalent of bulldozing a termite mound to make way for construction. A civilisation capable of harnessing the energy needed to cross stellar or greater distances is a civilisation capable of doing great damage.

    I think there are too many conflicts of interest to allow cybernetic forms to become so intelligent.
    Conflict of interest simply means some people will be for AI and some against. It has no bearing on whether AI will actually be built or not. The true deciding factor is whether it's physically possible or not. Personally I think if AI is possible it will be a great boon to the human race, exceeding that of the Industrial Revolution and possibly even the wheel.

    That doesn't answer my question on the 'why'.
    The fundamental point of the Singularity is that the normal rules for predicting technological advancement break down at some point. People living before the Industrial Revolution had no idea what the world would be like afterwards. Of course the Singularity will not happen with current means of production, but that may change, just as it changed during the Industrial Revolution.

    ---

    As for "becoming the Singularity" I believe that it is neither possible nor desireable. It is not possible because there will always be those who reject becoming transhuman or posthuman, and not desirable because a civilisation requires diversity to thrive, and that diversity includes "legacy systems" that are immune to any new problems by virtue of being so primitive. So, in effect, baseline humans will act as a "backup" in case things go horribly wrong. This is not to say that a significant portion of humanity will not "become the Singularity" in the sense that Sentinel means. I believe it is prudent to foster a viable breeding population of baseline humans, especially early on in the game.
  9. Sentinel
    I believe it is prudent to foster a viable breeding population of baseline humans, especially early on in the game.
    But NoX, while I'm sure that wasn't your intention, don't you think that sounds a bit elitist? Firstly, I think your position is calling for class division in the post-rev society, as the altered will in many respects be superior to the unaltered. Secondly, I regard the possible future alterations -- cybernetic, genetic, etc., which will make us 'posthuman' -- in the same light as the medical and educational breakthroughs that have occurred to this day and made us 'post-Cro-Magnon'.

    A post-revolutionary society should provide the alterations as the norm, as a human right. Sentient people who outright refuse to get them should be allowed to, but I do not think those will be many, provided the people are made aware of their benefits. Rather, those refusing to be altered are likely to be regarded as we today regard the occasional hermit weirdo, who chooses to isolate themselves from technological civilisation and modern medication.

    Then we do have the question of children. One good question is: should parents have the 'right' to produce unaltered offspring with, say, a significantly shorter lifespan than genetic engineering would allow? Or ones who will be prone to diseases it could have made them immune to?

    It may happen that a posthuman 'vanguard' is first to become the singularity; indeed, it's very likely. But I think that we as socialists shouldn't see this as a positive development -- the people would literally be at the mercy of this vanguard.

    We should strive for a society where as many as possible will transcend and become part of the singularity.
  10. ÑóẊîöʼn
    But NoX, while I'm sure that wasn't your intention, don't you think that sounds a bit elitist? I regard the possible future alterations -- cybernetic, genetic, etc., which will make us 'posthuman' -- in the same light as the medical and educational breakthroughs that have occurred to this day and made us 'post-Cro-Magnon'. A post-revolutionary society should provide the alterations as the norm, as a human right.
    Oh absolutely. But a certain number of people, however small, will reject such advances. I do not think we should withdraw the right to refuse, however foolish on the individual's part it may be. Obviously there should be checks and balances to ensure that they are rejecting such advances of their own free will and not because of indoctrination, pressure from others, etc.

    I don't see how that is elitist.

    Sentient people who outright refuse to get them should be allowed to, but I do not think those will be many, provided the people are made aware of their benefits. Rather, those refusing to be altered are likely to be regarded as we today regard the occasional hermit weirdo, who chooses to isolate themselves from technological civilisation and modern medication.
    Right. You do realise that such people will be immune to computer viruses which could cripple a significant portion of trans/posthumans who incorporate cyborgism into their bodies? Or that a particular genemod could turn out to make one fatally vulnerable to a previously harmless disease? I accept that the odds of such are low, and at the same time I am not advocating keeping a large section of humanity in a baseline condition, only those few who choose to remain in (or return to) such a condition, whatever their ostensible motivations for doing so may be.

    Then we do have the question of children. One good question is: should parents have the 'right' to produce unaltered offspring with, say, a significantly shorter lifespan than genetic engineering would allow? Or ones who will be prone to diseases it could have made them immune to?
    Obviously not, as children have no choice in the circumstances of their birth. But there will be those who will in adulthood say "no more", and I think we should respect their wishes. It might even be possible later on for a trans/posthuman to revert to a baseline state, and I think we should allow for that as well. I reckon the number of individuals who choose to revert to the baseline condition will be very small, perhaps even vanishingly so.

    It may happen that a posthuman 'vanguard' is first to become the singularity; indeed, it's very likely. But I think that we as socialists shouldn't see this as a positive development -- the people would literally be at the mercy of this vanguard.
    I don't think the diversity of human (or Transhuman, if you want to get particular) forms that are likely to arise during a Singularity should be treated as some sort of social hierarchy. In fact, if anything, the more "advanced" trans/posthumans will be doing more "work" than your average baseline, simply by virtue of being more competent. That is, if Artificial Intelligences don't end up running the show, leaving the rest of Transhumanity to a life of leisure and diverting activity.

    We should strive for a society where as many as possible will transcend and become part of the singularity.
    I believe that diversity is the key: the diversity of biological life is why it has lasted billions of years and weathered disasters that make anything human civilisation has encountered so far seem like a cakewalk. I believe that is an example to be emulated, and as in Nature, that includes "legacy" systems. Think of the "living fossils" such as the coelacanth, which prove that ancient species can still be competitive in today's ecosystems.
  11. chimx
    Noxion: I think this statement: "The fundamental point of the Singularity is that the normal rules for predicting technological advancement break down at some point."

    contradicts this statement: "Of course the Singularity will not happen with current means of production, but that may change, just as it changed during the Industrial Revolution."

    How can someone expect this singularity to occur without even having an understanding of future technological limitations? What is the basis for such a belief?
  12. ÑóẊîöʼn
    Noxion: I think this statement: "The fundamental point of the Singularity is that the normal rules for predicting technological advancement break down at some point."

    contradicts this statement: "Of course the Singularity will not happen with current means of production, but that may change, just as it changed during the Industrial Revolution."
    The second sentence is, I realise now, poorly worded and most likely superfluous.

    How can someone expect this singularity to occur without even having an understanding of future technological limitations? What is the basis for such a belief?
    I presume it has something to do with accelerating rates of change. Technology has changed far more in the past 50 years than it did in the thousand -- or even hundred -- years before that.
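
    Some back-of-the-envelope arithmetic shows how that pattern falls out of any steady exponential trend. A quick sketch, assuming a made-up 20-year doubling time for some overall index of technological capability (the figure is illustrative only, not a measurement):

        # If capability doubles every 20 years, how does the change over the
        # last 50 years compare with the change over the 1000 years before?
        doubling_time = 20.0  # years; an illustrative assumption

        def growth(years):
            """Capability index after the given number of years of doubling."""
            return 2 ** (years / doubling_time)

        recent = growth(1050) - growth(1000)  # change during the last 50 years
        earlier = growth(1000) - growth(0)    # change during the prior 1000
        print(f"last 50 years vs previous 1000: {recent / earlier:.1f}x the change")

    With those made-up numbers the last 50 years contain roughly 4.7 times the change of the entire preceding millennium, which is the sense in which a steady exponential concentrates its change in the most recent period.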
  13. chimx
    We are seeing an exponential rate of new technologies, but there is no reason to believe that this rate will continue at the same pace indefinitely. That is entirely conjecture that isn't really founded on anything.
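
    To put the objection in concrete terms, here is a quick sketch -- the ceiling is an arbitrary assumption, chosen only for illustration -- showing that pure exponential growth and logistic (resource-limited) growth are nearly indistinguishable early on, so an exponential-looking past does not by itself predict an exponential future:

        # Compare unconstrained exponential growth with logistic growth capped
        # by a carrying capacity K (standing in for production constraints).
        # Both K and the growth rate r are arbitrary illustrative choices.
        import math

        K = 1000.0  # assumed ceiling imposed by production constraints
        r = 0.1     # growth rate shared by both curves early on

        for t in (0, 20, 40, 60, 80, 100):
            exponential = math.exp(r * t)
            logistic = K / (1 + (K - 1) * math.exp(-r * t))
            print(f"t={t:3d}  exponential={exponential:10.1f}  logistic={logistic:7.1f}")

    The two curves track each other closely until roughly t = 40; after that the constrained curve flattens out, and nothing in the early data tells you which world you are in.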