Singularitarian Principles

  1. ÑóẊîöʼn
    What do fellow HGPers make of this?

    From HERE

    Definitional Principles:

    Singularity

    Activism

    Ultratechnology

    Globalism


    Descriptive Principles:


    Apotheosis

    Solidarity

    Intelligence

    Independence

    Nonsuppression



    Definitional Principles:


    These are the qualities which define the term "Singularitarian". As with any statement about human beings, each Principle may hold true of an individual Singularitarian to a greater or lesser degree. I have no particular authority to write the definition, but someone has to do it.
    (Commentary.)

    Singularity

    The "Singularity" has been defined many different ways. The primary and original definition, as invented by Vernor Vinge, is that the Singularity is the fundamental discontinuity in history created by the technological invention of smarter-than-human intelligence. Other definitions have included a time of exponentially faster technological progress (even faster than now, that is), or the positive-feedback effect created by enhanced intelligences working out improved methods of intelligence enhancement. The core idea remains the same: There is a massive discontinuity approaching, a Singularity, within human history. This has to do with the rise of smarter-than-human intelligence, the ability of technology to alter human nature, the final conquest of material reality through nanotech, or some other fundamental change in the rules.
    A Singularitarian believes that the Singularity is possible, that the Singularity is a good thing, and that we should help make it happen (see Activism).
    (Commentary.)

    Activism

    Singularitarians are the partisans of the Singularity. A Singularitarian is someone who believes that technologically creating a greater-than-human intelligence is desirable, and who works to that end.
    A Singularitarian is advocate, agent, defender, and friend of the future known as the Singularity.
    See also the Singularitarian mailing list and the Singularity Institute.
    (Commentary.)

    Ultratechnology

    The "Singularity" is a natural, non-mystical, technologically triggered event. We, the Singularitarians, are allied in the purpose of bringing about a natural event through natural means, not sitting in a circle chanting over a computer. There are thousands, perhaps millions, of stories and prophecies and rituals that allegedly involve something that could theoretically be described as "greater-than-human intelligence". What distinguishes the Singularitarians is that we want to bring about a natural event, working through ultratechnologies such as AI or nanotech, without relying on mystical means or morally valent effects. (Commentary.)

    Globalism

    The principle of Globalism is a subtle distinction that marks the difference between the terms "Singularitarian" and "posthumanist". It's possible to want to bring about an event that would qualify as a "Singularity" without being a "Singularitarian". Someone who thinks that the first uploadee will win all the marbles and leave the rest of humanity out in the cold, and who wants to personally be that first upload, is trying to bring about an event that would qualify as the Singularity... but she (1) is not a Singularitarian. A posthumanist, but not a Singularitarian. Perhaps the best analogy is to "liberty" and "libertarian". Being a "libertarian" doesn't mean that you advocate liberty only for yourself, but for a society. You can be a libertarian from first moral principles, or as a matter of pragmatism, or because your astrologer told you to. Motivations have nothing to do with the definition, which is simply that a libertarian is someone who advocates liberty for everyone. Someone who advocates liberty only for himself could as easily be in favor of autocracy, theocracy, monarchy, dictatorship... just about anything, actually.
    Similarly, although the Singularity is simply the creation of greater-than-human intelligence, the "Singularity" in "Singularitarian" is the Singularity as seen from the perspective of the vast majority of humanity. It's the event seen from a global perspective, just like the "liberty" in "libertarian" is global. If you don't advocate global liberty, you aren't a libertarian. If you don't advocate global Singularity, if you just advocate a personal, private Singularity, then you're not a Singularitarian.
    (Commentary.)

    Descriptive Principles:

    These are items which aren't strictly necessary to the definition, but which form de facto parts of the Singularitarian meme. These Principles should be considered as descriptions, rather than tests. (Commentary.)

    Apotheosis

    The Singularity holds out the possibility of winning the Grand Prize, the true Utopia, the best-of-all-possible-worlds - not just freedom from pain and stress or a sterile round of endless physical pleasures, but the prospect of endless growth for every human being - growth in mind, in intelligence, in strength of personality; life without bound, without end; experiencing everything we've dreamed of experiencing, becoming everything we've ever dreamed of being; not for a billion years, or ten-to-the-billionth years, but forever... or perhaps embarking together on some still greater adventure of which we cannot even conceive. That's the Apotheosis. If any utopia, any destiny, any happy ending is possible for the human species, it lies in the Singularity.
    There is no evil I have to accept because "there's nothing I can do about it". There is no abused child, no oppressed peasant, no starving beggar, no crack-addicted infant, no cancer patient, literally no one that I cannot look squarely in the eye. I'm working to save everybody, heal the planet, solve all the problems of the world.
    (Commentary.)

    Solidarity

    The Singularity belongs to humanity. It's our task as a species. I think the best analogy is to an infant waiting to be born. If infants were aware, before their birth, it would undoubtedly take courage to journey down the birth canal without knowing what lay on the other side - without knowing if anything at all lay on the other side. But we can't stay in the womb forever, and if we could, it'd be pointless. It would take courage for a newborn to step into a future that was new, and strange, and unfamiliar - but the real world is out here. Or in the case of the Singularity, out there.
    At this point, nobody really knows what happens after the Singularity - but whatever happens, it was the whole point of having a human species in the first place.
    (Commentary.)

    Intelligence

    At the heart of an appreciation of the Singularity lies an appreciation of intelligence. The Singularity places a horizon across our understanding because we can't predict what someone smarter than us is going to do; if we could, we'd be that smart ourselves. Intelligence isn't just the ability to come up with complex solutions to complex problems; it's the ability to see the shortcuts, the simple and obvious-in-retrospect solutions to complex problems - even emotional or philosophical problems. Intelligence isn't just high-speed thinking, perfect memories, or other party tricks; intelligence is also wisdom, and self-awareness, and other things that extend into every aspect of mind and personality. It's transhuman intelligence that lies at the heart of the Singularity, but we respect intelligence on the human scale as well. This respect for intelligence is our shield against being blinded by ideology, one of the primary safeguards that prevents Singularitarianism from turning into just another banal fanaticism. Cloudy thinking in the service of the Singularity wouldn't be a virtue; it'd be another dreary manifestation of the same old human stupidity that we're trying to get away from.
    Of course, it's not just enough to make a commitment to intelligence. You have to back it up with the ability to perceive intelligence, and to distinguish it from stupidity. That high art is a topic for some other page.
    But making the commitment is also important.
    (Commentary.)

    Independence

    Independence means regarding the Singularity as a personal goal. The desire to create the Singularity is not dependent on the existence, assistance, permission, or encouragement of other Singularitarians. If every other Singularitarian on the planet died in a tragic trucking accident, the last remaining Singularitarian would continue trying to make the Singularity happen. A Singularitarian is a friend of the Singularity - not a friend of Singularitarianism, or the Singularity meme, or some Singularitarian leader. We defend the Singularity itself. Nothing less.
    (Commentary.)

    Nonsuppression

    Even though some of us may have different opinions about the wisdom of certain technologies - I'm not very fond of nanotech, and I know of at least one transhumanist who's strongly against AI - we still don't believe in the suppression of technology. Partially this is because the intellectual heritage of Singularitarianism derives from transhumanism, the scientific community, the science-fictional community, the computing community, technophiles, and other groups which despise the concept of legally enforced technophobia. Also:

    Trying to suppress technologies tends not to work.

    If I can try to suppress nanotech, someone else can try to suppress AI, so we'll all be better off if nobody resorts to fisticuffs.

    Any attempt to regulate one technology sets a trend.

    And then there's just general niceness. Nonsuppression - or rather, the non-initiation of suppression - helps maintain cooperation among technophiles, so it's an auxiliary ethic. (Commentary.)

    1: I flip a coin to determine whether a human is male or female. AIs and posthumans are "ve".

    ---

    It seems sound to me, as a starting point perhaps. What do the rest of you think?
  2. Invincible Summer
    Curious - why are you against nanotech, and why would a transhumanist be against AI?
  3. ÑóẊîöʼn
    Curious - why are you against nanotech, and why would a transhumanist be against AI?
    I'm not against nanotech; I just copied and pasted the article from somewhere else, and the original author is expressing his own opinion.

    There are, I feel, perfectly valid reasons for Transhumanists to oppose AI, even if I do not agree with them myself. A major issue with regard to AI is the so-called "friendliness problem". It is one thing to create AI, but it is quite another thing to create AI that does not end up biting us all in the arse in some fashion. Unlike human beings and other animals, which have things like empathy and "common sense" already wired in unless you specifically remove them, such things have to be deliberately implemented in the design of an Artificial Intelligence.

    The reason I am not opposed to the development of AI is because I believe the friendliness problem is not insurmountable, and that the potential benefits outweigh the potential risks.
  4. Elect Marx
    I'm impressed, NoX. Back when we argued about super-technology (way... back), I had the impression that you thought risk-benefit analysis of new technology was absurd. I do think nanotech is likely the most dangerous and fundamentally significant area for the future. That said, whether to pursue it is really a moot question. I have read a bit on developments, and like it or not, it is coming. As long as mega-corporations have the ability to do it, they will be delving into nanotech. I do hope it doesn't kill us all and cannot be controlled by the capitalists; but what's new?

    Sorry to run off on a tangent, I am also interested in discussing the singularity in depth.
  5. ÑóẊîöʼn
    I'm impressed, NoX. Back when we argued about super-technology (way... back), I had the impression that you thought risk-benefit analysis of new technology was absurd. I do think nanotech is likely the most dangerous and fundamentally significant area for the future. That said, whether to pursue it is really a moot question. I have read a bit on developments, and like it or not, it is coming. As long as mega-corporations have the ability to do it, they will be delving into nanotech. I do hope it doesn't kill us all and cannot be controlled by the capitalists; but what's new?

    Sorry to run off on a tangent, I am also interested in discussing the singularity in depth.
    Well, nanotech is likely to play a part in the Singularity.

    With regard to risk-benefit analysis: I think it's naive to use the results of such analyses as an excuse to ban a given technology. If a risk-benefit analysis shows that a certain technology has a lot of risks attached to it, then we should take that as a warning to be careful, rather than shrink back and reflexively ban research into the technology. Banning research simply means that it moves to areas with more liberal views on such things - and perhaps places with laxer safety standards too! No, if something is potentially risky, then it behooves us to take reasonable precautions when developing, researching and employing it. Perhaps during our investigations we can find ways of making it safer - something that fans of the "precautionary principle" often seem to forget.

    Essentially, what we must do is strike a balance between being carelessly reckless and being overly timid.