-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
Sent: Sunday, September 19, 2004 8:06 AM
To: [EMAIL PROTECTED]
Subject: RE: [agi] Re: AI boxing

Hi Ben,

One thing I agree with Eliezer Yudkowsky on: worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating the human race is much more worthwhile than worrying about who is President of the US.

It's the nature of evolution that getting to [...]
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Arthur T. Murray
Sent: Friday, September 17, 2004 10:13 AM
To: [EMAIL PROTECTED]
Subject: [agi] Re: AI boxing

On Fri, 17 Sep 2004, Ben Goertzel wrote:
[...]
In short, it really makes no sense to create an AI, allow it to indirectly affect human affairs, and then make an absolute decision to keep it in a box. And it also makes no sense to create an AI and not allow it to affect human affairs at all.