Alan Grimes wrote:
> My position is that you don't really need friendly AI, you simply need
> to neglect to include the "take over world" motivator...
>

I think that is a VERY bad approach!!!

I don't want a superhuman AGI to destroy us by accident or through
indifference... those possibilities are just as real as aggression.

-- Ben G
