RE: [agi] Re: AI boxing
Hi Ben,

One thing I agree with Eliezer Yudkowsky on is: worrying about how to increase the odds of AGI, nanotech, and biotech saving rather than annihilating the human race is much more worthwhile than worrying about who is President of the US.

It's the nature of evolution that getting to a preferred future depends on getting through every particular today between here and there. So the two issues above may not be as disconnected as you suggest. :)

Cheers,
Philip

---
To unsubscribe, change your address, or temporarily deactivate your subscription, please go to http://v2.listbox.com/member/[EMAIL PROTECTED]
RE: [agi] Re: AI boxing
On Sun, 19 Sep 2004, Philip Sutton wrote:

Hi Ben, One thing I agree with Eliezer Yudkowsky on is: Worrying about how to increase the odds of AGI, nanotech and biotech saving rather than annihilating the human race, is much more worthwhile than worrying about who is President of the US.

We need the following items that say Cocainer-in-Chief:

[ ] balloons
[ ] banners
[ ] baseball caps
[ ] board games
[ ] bumper stickers
[ ] buttons
[ ] coffee mugs
[ ] greeting cards (birthday, etc.)
[ ] protest signs
[ ] rubber stamps
[ ] stationery
[ ] sweaters
[ ] T-shirts
[ ] toys (action figures, etc.)

If you can provide these items, please offer them for sale on eBay.
RE: [agi] Re: AI boxing
Philip,

But the connection between present events and the Singularitarian future isn't all that easy to calculate. Bush, in addition to carrying out his questionable acts of foreign policy and tax reform, will probably boost DARPA's budget more than Kerry would. Perhaps this will result in more money for blue-sky nanotech research, resulting in world-transforming nanotech coming about ten years earlier than it would have otherwise, and saving the human race from oblivion. Or maybe not.

On balance, I suspect that Bush being re-elected would be worse for the long-term future of humanity than otherwise. However, it's not THAT crystal-clear to me, because there are so many complex factors involved. And I don't think that, at this stage, anyone has a clear enough view to make a significantly more certain assessment.

-- Ben
RE: [agi] Psychometric AI
I noticed that too. It seems this list doesn't archive attachments (or has a particularly good SPAM filter :-). I don't have the paper posted on any site. I will send you a PDF (748 KB). If others want a copy, let me know via email. Thanks!

J. W.

Hi,

Please send me a copy too, thanks.

YKY