"A single genetically engineered child born with a substantially
smarter-than-human IQ would constitute a Singularity"
That is a flaw in your definition.
The all-encompassing definition of the Singularity is the point at which an intelligence gains the ability to recursively improve the computational processes underlying its own intelligence.
Period.
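For what it's worth, here is a minimal sketch of what I mean by that loop, written in Python purely as an illustration; the function names and the evaluate callback are hypothetical, not a claim about how a real seed AI would be built:

# Illustrative only: "recursive self-improvement" as a loop in which the
# optimizer that produces improvements is itself the object being improved.
def recursive_self_improvement(optimizer, evaluate, max_rounds=100):
    for _ in range(max_rounds):
        candidate = optimizer(optimizer)   # the optimizer is applied to itself
        if evaluate(candidate) <= evaluate(optimizer):
            break                          # no further improvement found
        optimizer = candidate              # adopt the improved optimizer
    return optimizer

The point of the sketch is only that the loop closes on itself: each improved version is better at producing the next improvement, which is what separates this from ordinary, externally driven improvement.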
-hank
On 10/10/06, Michael Anissimov <[EMAIL PROTECTED]> wrote:
The Singularity definitions being presented here are incredibly
confusing and contradictory. If I were a newcomer to the community
and saw this thread, I'd say that this word "Singularity" is so poorly
defined, it's useless. Everyone is talking past each other. As Nick
Hay has pointed out, the Singularity was originally defined as
smarter-than-human intelligence, and I think that this definition
remains the most relevant, concise, and resistant to
misinterpretation.
It's not about technological progress. It's not about experiencing an
artificial universe by being plugged into a computer. It's not about
human intelligence merging with computing technology. It's not about
things changing so fast that we can't keep up, or the accretion of
some threshold level of knowledge. All of these things *might* indeed
follow from a Singularity, but might not, making it important to
distinguish between the likely *effects* of a Singularity and *what
the Singularity actually is*. The Singularity *actually is* the
creation of smarter-than-human intelligence, but there are as many
speculative scenarios about what would happen thereafter as there are
people who have heard about the idea.
The number of completely incompatible Singularity definitions being
tossed around on this list underscores the need for a return to the
original, simple, and concise definition, which, because it doesn't
make a million and one side claims, is also the easiest to explain to
those being exposed to the idea for the first time. We have to define
our terms to have a productive discussion, and the easiest way to
define a contentious term is to make the definition as simple as
possible. The reason so many in the intellectual community see
Singularity discussion as garbage is that there is so little
definitional consensus that it's close to impossible to determine
what's actually being discussed.
Smarter-than-human intelligence. That's all. Whether it's created
through Artificial Intelligence, Brain-Computer Interfacing,
neurosurgery, genetic engineering, or the fundamental particles making
up my neurons quantum-tunneling into a smarter-than-human
configuration - the Singularity is the point at which our ability to
predict the future breaks down because a new character is introduced
that is different from all prior characters in the human story.
The creation of smarter-than-human intelligence is called "the
Singularity" by analogy to a gravitational singularity, not a
mathematical singularity. Nothing actually goes to infinity. In
physics, our models of black hole spacetimes spit out infinities
because they're fundamentally flawed, not because nature itself is
actually producing infinities. Any relationship between the term
Singularity and the definition of singularity that means "the quality
of being one of a kind" is coincidental.
The analogy between our inability to predict physics past the event
horizon of a black hole and our inability to predict the actions of a
superintelligence is apt,
because we know for a fact that our minds are conditioned, both
genetically and experientially, to predict the actions of other human
minds, not smarter-than-human minds. We can't predict what a
smarter-than-human mind would think or do, specifically. But we can
predict it in broad outlines - we can confidently say that a
smarter-than-human intelligence will 1) be smarter-than-human (by
definition), 2) have all the essential properties of an intelligence,
including the ability to model the world, make predictions, synthesize
data, formulate beliefs, etc., 3) have starting characteristics
dictated by the method of its creation, 4) have initial motivations
dictated by its prior, pre-superintelligent form, 5) not necessarily
display characteristics similar to its human predecessors, and so on.
We can predict that a superintelligence would likely be capable of
putting a lot of optimization pressure behind its goals.
The basic Singularity concept is incredibly mundane. In the midst of
all this futuristic excitement, we sometimes forget this. A single
genetically engineered child born with a substantially
smarter-than-human IQ would constitute a Singularity, because we would
have no ability to predict the specifics of what it would do, whereas
we have a much greater ability to predict the actions of typical
humans. It's also worth pointing out that the Singularity is an
event, like the first nuclear test, not a thing, like the first nuke
itself. It heralds an irreversible transition to a new era, but our
guesses at the specifics of that era are inextricably tied to the real
future conditions under which we make that transition.
The fact that it is sometimes difficult to predict the actions of
everyday humans does not doom this definition of the Singularity. The
fact that "smarter-than-human" is a greyscale rather than
black-and-white does not condemn it either. The Singularity is one of
those things that we'd probably recognize if we saw it, but because it
hasn't happened yet it's very difficult to talk about coherently.
The Singularity is frequently associated with technology simply
because technology is the means by which agents that can't mold their
environments directly are able to get things done in a limited time.
So by default, we assume that a superintelligence would use technology
to get things done, and use a lot of it. But there are possible
beings that need no technology to accomplish significant goals. For
example, in the future there might be a being that can build a nuclear
reactor simply by swallowing uranium and internally processing it into
the right configuration. No "technology" required.
The Singularity would still be possible if technological progress were
slowed down or halted. It would still be possible (albeit difficult)
if every computer on the planet were smashed to pieces. It would be
possible even if it turned out that intelligence can't exist inside a
computer.
A Singularity this century could easily be stopped, for example if a
disease wiped out half of humanity, if a global authoritarian regime
forbade research in that direction, or if a nuclear war ejected
sufficient dust into the air to shut down photosynthesis. The
Singularity is far from inevitable.
The Singularity can be a bad thing, resulting in the death of all
human beings, or a good thing, such that every single human being on
earth can explicitly say that they are glad that it happened. There
are also different shades of good: for example, a Singularity that
results in the universal availability of "genie machines" could
eliminate all journeys of value, by taking us right to the destination
whether we want it or not.
As we can see, the definition of the Singularity I'm presenting
encompasses a lot of possibilities. That's part of its elegance.
By making a minimal number of assumptions, it requires the least
amount of evidence to back it up. All it requires is that humans
aren't the smartest physically possible beings in the universe, and
that we will some day have the ability to either upgrade our brains,
or create new brains that are smarter than us by design.
--
Michael Anissimov
Lifeboat Foundation http://lifeboat.com
http://acceleratingfuture.com/michael/blog
-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/[EMAIL PROTECTED]
