RE: [agi] Re: AI boxing

2004-09-19 Thread Philip Sutton
Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating the human race is much more worthwhile than worrying about who is President of the US. It's the nature of evolution that getting to

RE: [agi] Re: AI boxing

2004-09-19 Thread Arthur T. Murray
On Sun, 19 Sep 2004, Philip Sutton wrote: Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating the human race is much more worthwhile than worrying about who is President of the US.

RE: [agi] Re: AI boxing

2004-09-19 Thread Ben Goertzel
From: Philip Sutton, Sent: Sunday, September 19, 2004 8:06 AM, Subject: RE: [agi] Re: AI boxing. Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating

[agi] Re: AI boxing

2004-09-17 Thread Arthur T. Murray
On Fri, 17 Sep 2004, Ben Goertzel wrote: [...] In short, it really makes no sense to create an AI, allow it to indirectly affect human affairs, and then make an absolute decision to keep it in a box. And it also makes no sense to create an AI and not allow it to affect human affairs at

RE: [agi] Re: AI boxing

2004-09-17 Thread Ben Goertzel
From: Arthur T. Murray, Sent: Friday, September 17, 2004 10:13 AM, Subject: [agi] Re: AI boxing. On Fri, 17 Sep 2004, Ben Goertzel wrote: [...] In short, it really makes no sense to create an AI, allow