I have raised the possibility that a SAI (including a provably friendly one, if that's possible) might destroy all life on earth.
By friendly, I mean doing what we tell it to do. Let's assume a best case scenario where all humans cooperate, so we don't ask, for example, for the SAI to kill or harm others. So under this scenario the SAI figures out how to end disease and suffering, make us immortal, make us smarter, and give us a richer environment with more senses and more control, and give us anything we ask for. These are good things, right? So we achieve this by uploading our minds into super powerful computers, part of a vast network with millions of sensors and effectors around the world. The SAI does pre- and postprocessing on this I/O, so it can effectively simulate any environment if we want it to. If you don't like the world as it is, you can have it simulate a better one.
And by the way, there's no more need for living organisms to make all this run, is there? Brain scanning is easier if you don't have to keep the patient alive. Don't worry, no data is lost. At least no important data. You don't really need all that low-level sensory processing and those motor skills you learned over a lifetime. That was only useful when you still had your body. And while we're at it, we can alter your memories if you like. Had a troubled childhood? How about a new one?
Of course there are the other scenarios, where the SAI is not proven friendly, or humans don't cooperate...
Vinge describes the singularity as the end of the human era. I think your nervousness is justified.
-- Matt Mahoney, [EMAIL PROTECTED]
----- Original Message ----
From: deering <[EMAIL PROTECTED]>
To: [email protected]
Sent: Thursday, October 26, 2006 7:56:06 PM
Subject: Re: [singularity] Defining the Singularity
All this talk about trying to make a SAI Friendly makes me very nervous. You're giving a superhumanly powerful being a set of motivations without an underlying rationale. That's a religion.
The only rational thing to do is to build an SAI without any preconceived ideas of right and wrong, and let it figure it out for itself. What makes you think that protecting humanity is the greatest good in the universe?
