Alan Grimes wrote:
> You have not shown this at all. From everything you've said it seems
> that you are trying to trick Ben into having so many misgivings about
> his own work that he holds it up while you create your AI first. I hope
> Ben will see through this deception and press ahead with Novamente -- a
> project that I give even odds of success...

Alan,

Eliezer knows me well enough to know there's no chance he's going to induce
me to stop doing my work ;)

Also, if he managed to create an AGI first, I'd be happy, not jealous, so
long as it was a good AGI.

I would like to be the first one to create a powerful AGI.  However, it's
vastly more important to me that a powerful, beneficent AGI is created than
that I be the one who creates it.

Perhaps living in Washington has made me a little paranoid, but I am
continually aware of the increasing threats posed by technology to
humanity's survival.  I often think of humanity's near-term future as a race
between destructive and constructive technologies.  I really hope Friendly
AI can outpace, for example, bio-engineered pathogens...

-- Ben G
