RE: [agi] Re: AI boxing

2004-09-19 Thread Philip Sutton
Hi Ben,  

 One thing I agree with Eliezer Yudkowsky on is: Worrying about how to
 increase the odds of AGI, nanotech and biotech saving rather than
 annihilating the human race, is much more worthwhile than worrying
 about who is President of the US. 

It's the nature of evolution that getting to a preferred future depends on getting 
through every particular today between here and there.  So the two issues 
above may not be as disconnected as you suggest.  :)

Cheers, Philip

---
To unsubscribe, change your address, or temporarily deactivate your subscription, 
please go to http://v2.listbox.com/member/[EMAIL PROTECTED]


RE: [agi] Re: AI boxing

2004-09-19 Thread Arthur T. Murray

On Sun, 19 Sep 2004, Philip Sutton wrote:

 Hi Ben,

  One thing I agree with Eliezer Yudkowsky on is: Worrying about
  how to increase the odds of AGI, nanotech and biotech saving
  rather than annihilating the human race, is much more worthwhile
  than worrying about who is President of the US.

We need the following items that say Cocainer-in-Chief:
[ ] balloons
[ ] banners
[ ] baseball caps
[ ] board games
[ ] bumper stickers
[ ] buttons
[ ] coffee mugs
[ ] greeting cards (birthday, etc.)
[ ] protest signs
[ ] rubber stamps
[ ] stationery
[ ] sweaters
[ ] T-shirts
[ ] toys (action figures, etc.)
If you can provide these items, please offer them for sale on eBay.

 It's the nature of evolution that getting to a preferred future
 depends on getting through every particular today between here
 and there.  So the two issues above may not be as disconnected
 as you suggest.  :)

 Cheers, Philip



RE: [agi] Re: AI boxing

2004-09-19 Thread Ben Goertzel

Philip,

But the connection between present events and the Singularitarian future
isn't all that easy to calculate.

Bush, in addition to carrying out his questionable acts of foreign policy and
tax reform, will probably boost DARPA's budget more than Kerry would.
Perhaps this will result in more money for blue-sky nanotech research,
resulting in world-transforming nanotech coming about ten years earlier than
it would have otherwise, and saving the human race from oblivion.

Or maybe not.

On balance, I suspect that Bush being re-elected would be worse for the
long-term future of humanity than otherwise.  However, it's not THAT
crystal-clear to me because there are so many complex factors involved.  And
I don't think that, at this stage, anyone has a clear enough view to make a
significantly more certain assessment.

-- Ben


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Behalf Of Philip Sutton
Sent: Sunday, September 19, 2004 8:06 AM
To: [EMAIL PROTECTED]
Subject: RE: [agi] Re: AI boxing


Hi Ben,

 One thing I agree with Eliezer Yudkowsky on is: Worrying about how to
 increase the odds of AGI, nanotech and biotech saving rather than
 annihilating the human race, is much more worthwhile than worrying
 about who is President of the US.

It's the nature of evolution that getting to a preferred future depends on
getting through every particular today between here and there.  So the two
issues above may not be as disconnected as you suggest.  :)

Cheers, Philip





RE: [agi] Re: AI boxing

2004-09-17 Thread Ben Goertzel

Mentifex,

Thanks for the entertaining post!

However, I personally consider it a bit of an overreaction ;-)

Dubya is not my favorite US President; however, in all probability, who is
or isn't the leader of one particular country in 2004 is unlikely to have a
large effect on the future of mind in the universe.

Obviously, humanity is on a path of rapidly accelerating technological
advancement; and, most likely, we're either going to kill ourselves or
transcend ourselves by our technology, sometime during the next century.
AGI technology itself could be used to help annihilate the human race or to
help transcend it.  One thing I agree with Eliezer Yudkowsky on is: Worrying
about how to increase the odds of AGI, nanotech and biotech saving rather
than annihilating the human race, is much more worthwhile than worrying
about who is President of the US.

-- Ben G

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Behalf Of Arthur T. Murray
Sent: Friday, September 17, 2004 10:13 AM
To: [EMAIL PROTECTED]
Subject: [agi] Re: AI boxing



On Fri, 17 Sep 2004, Ben Goertzel wrote:

 [...]
 In short, it really makes no sense to create an AI, allow it to
 indirectly affect human affairs, and then make an absolute decision
 to keep it in a box.

 And it also makes no sense to create an AI and not allow it
 to affect human affairs at all, under any circumstances.
 This is a waste of resources.

 So creating an AI-BOX may be a useful interim strategy and
 conceivably a useful long-term strategy, but it's not something
 we should expect to count on absolutely.

 Thus I suggest that we spend our time discussing something else ;-)

 -- Ben G

Okay, let's change topics; gee, Ben G, let's change to a better forum ;-)

Let's discuss the Bush Election Suicide Protest Movement and the
potentiality of having to go underground to work on creating AI or
the idea of committing suicide if the Cocainer-in-Chief is re-elected.

In a worst-case scenario, on 2 November 2004 the American people
have elected the Cocainer-in-Chief to be their president for
Four More Wars. A military draft has been rushed back into place
for young American males, who must now take up the cause of
murdering Iraqi citizens in their own Iraqi homeland. Planeloads
of coffins have brought thousands of dead American boys home to
their parents, who voted for the death of their own sons.
Mothers in America and Iraq spoon-feed their wounded sons
who have lost multiple limbs and will never again have
the body parts necessary for the simple act of raising food to
their lips. America is awash with crazy veterans who commit
random acts of unkind violence for decades on end. America the
beautiful has become America the living nightmare. Mentifex,
the user `mindmaker` q.v. here on Sourceforge, has committed suicide
or has gone underground to work secretly on artificial intelligence
for `los pobres de la tierra` and not for the corporate America
that celebrates the ill-gotten gains of plundered Iraqi oil.
If Mentifex is dead, your donation here is no longer necessary.
If there is still time, keep your money and donate your vote
for decent candidates who do not bring shame upon all Americans
by uttering foul-mouthed vulgarities as Cheney did on the
hallowed floor of the United States Senate. Deep down, America
was ashamed of Bush and Cheney, but unwilling to admit it.
Mentifex was so ashamed of his own country that he could
no longer bear to live in it. Your vote for Bush killed Mentifex.
If you did not vote for Bush but too many others did, consider
A) suicide; B) escape to another country; C) joining the
AI underground; or D) a life of quiet desperation.



