RE: [agi] Re: AI boxing

2004-09-19 Thread Philip Sutton
Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating the human race is much more worthwhile than worrying about who is President of the US. It's the nature of evolution that getting to

RE: [agi] Re: AI boxing

2004-09-19 Thread Arthur T. Murray
On Sun, 19 Sep 2004, Philip Sutton wrote: Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating the human race is much more worthwhile than worrying about who is President of the US.

RE: [agi] Re: AI boxing

2004-09-19 Thread Ben Goertzel
From: [EMAIL PROTECTED] On Behalf Of Philip Sutton Sent: Sunday, September 19, 2004 8:06 AM To: [EMAIL PROTECTED] Subject: RE: [agi] Re: AI boxing Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating

RE: [agi] Re: AI boxing

2004-09-17 Thread Ben Goertzel
Mentifex, Thanks for the entertaining post! However, I personally consider it a bit of an overreaction ;-) Dubya is not my favorite US President; however, who is or isn't the leader of one particular country in 2004 is unlikely to have a large effect on the future of mind