Maybe I don't understand why we need to explain the Singularity to others.  Years ago I realized that if we can make machines smarter than ourselves, then so can they.  I also realized that the growth of knowledge is faster than exponential over the long term: the time constant gets shorter as we learn more.  Under simple assumptions, the curve fits a hyperbola.  I did not know the name for this point of infinite knowledge, but it was apparent that it would probably occur within the next 100 years.
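
To be concrete about "simple assumptions": here is a minimal sketch (toy constants, not a fit to real data) of the kind of model I mean.  Assume knowledge K grows at a rate proportional to K^2, because each new piece of knowledge also shortens the time needed to produce the next.  Then dK/dt = c*K^2 integrates to K(t) = K0/(1 - c*K0*t), a hyperbola that blows up at the finite time T = 1/(c*K0).  A few lines of Python check the arithmetic:

    # Toy model: dK/dt = c*K^2 -- knowledge accelerates its own growth.
    # Closed form: K(t) = K0/(1 - c*K0*t), infinite at T = 1/(c*K0).
    c, K0, dt = 0.01, 1.0, 0.001   # arbitrary toy constants
    T = 1.0 / (c * K0)             # the "singularity" time under this model

    K, t = K0, 0.0
    while t < 0.9 * T:             # simulate, stopping short of the blowup
        K += c * K * K * dt        # Euler step: dK = c*K^2 * dt
        t += dt

    closed_form = K0 / (1.0 - c * K0 * t)
    print("simulated K = %.2f, closed form = %.2f, blowup at t = %.0f"
          % (K, closed_form, T))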

The implications of super-smart machines raise disturbing questions.  Is the brain a computer, or does it do something that computers can't?  If you make an exact copy of yourself, then kill the "original", are you alive or not?  What if the copy is not exact?  What if the copy consists of just the knowledge encoded in your brain, sitting in a file?  If that knowledge is used in a computation that simulates your interactions with an external world, does the artificial world become your "awareness"?  How do you know this has not already happened?

Most people prefer not to think about such things.  We are programmed through evolution to propagate the species, which means reproducing and fearing things that can kill us.  That requires behaving as if the world is real and operates in predictable ways that we have some control over.  It is consistent with a belief in self-awareness, the belief that the brain does more than just compute.  This belief is universal.  You can't change it as long as you are dealing with human brains.

With the Singularity, all this goes out the window.  If a super-AGI realizes that thought is simply computation, then the logical decision, even if it is friendly, might be to annihilate the human race and replace it with something "better".

As a practical matter, the Singularity is far enough in the future that we need not worry about it.  It should not enter into our daily decision making.  If someone is interested in these ideas, I will talk about them; otherwise I won't.

-- Matt Mahoney, [EMAIL PROTECTED]

