Hi Arthur,
On Wed, 12 Feb 2003, Arthur T. Murray wrote:
. . .
Since the George and Barbara Bushes of this world
are constantly releasing their little monsters onto the planet,
why should we creators of Strong AI have to take any
more precautions with our Moravecian Mind Children
than human parents do with their human babies?
I don't think any human alive has the moral and ethical underpinnings to allow them to
resist the corruption of absolute power in the long run. We are all kept in check by
our lack of power, the competition of our fellow humans, the laws of society, and the
instructions of our peers. Remove
Arthur T. Murray wrote:
[snippage]
why should we creators of Strong AI have to take any
more precautions with our Moravecian Mind Children
than human parents do with their human babies?
Here are three reasons I can think of, Arthur:
1) Because we know in advance that 'Strong AI', as you
Brad Wyble wrote:
I don't think any human alive has the moral and ethical underpinnings
to allow them to resist the corruption of absolute power in the long
run.
I am exceedingly glad that I do not share your opinion on this. Human
altruism *is* possible, and indeed I observe myself possessing a
significant measure of it. Anyone doubting their ability to 'resist
corruption' should not IMO be working in AGI, but should be doing some
serious
On Wed, 12 Feb 2003, Arthur T. Murray wrote:
The quest is as hopeless as it is with human children.
Although Bill Hibbard singles out the power of super-intelligence
as the reason why we ought to try to instill morality and friendliness
in our AI offspring, such offspring are made in our own
I can't imagine the military would be interested in AGI, by its very definition. The
military would want specialized AIs, constructed around a specific purpose and under
their strict control. An AGI goes against everything the military wants from its
weapons and agents. They train soldiers
Brad Wyble wrote:
Tell me this, have you ever killed an insect because it bothered you?
In other words, posthumanity doesn't change the goal posts. Being human
should still confer human rights, including the right not to be enslaved,
eaten, etc. But perhaps being posthuman will confer
minded enough to consider others' thoughts/concerns. That will mean a lot as
this project progresses towards completion.
Kevin
- Original Message -
From: Eliezer S. Yudkowsky [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Wednesday, February 12, 2003 1:53 PM
Subject: Re: [agi] AI
- Original Message -
From: Philip Sutton
To: [EMAIL PROTECTED]
Sent: Wednesday, February 12, 2003 2:55 PM
Subject: Re: [agi] AI Morality -- a hopeless quest
Brad,
Maybe what you said below is the key to friendly GAI
I don't think any human
On Wed, 12 Feb 2003, Brad Wyble wrote:
I can't imagine the military would be interested in AGI, by its very
definition. The military would want specialized AIs, constructed
around a specific purpose and under their strict control. An AGI goes
against everything the military wants from its
As has been pointed out on this list before, the military IS interested in
AGI, and primarily for information integration rather than directly
weapons-related purposes.
See
http://www.darpa.mil/body/NewsItems/pdf/iptorelease.pdf
for example.
-- Ben G
Brad Wyble wrote:
Under the ethical code you describe, the AGI would swat
them like a bug with no more concern than you swatting a mosquito.
I did not describe an ethical code, I described two scenarios about a
human (myself) then suggested the non-bug-swatting scenario was
possible,
Daniel,
For a start look at the IPTO web page and links from:
http://www.darpa.mil/ipto/research/index.html
DARPA has a variety of Offices which sponsor AI related work, but IPTO is
now being run by Ron Brachman, the former president of the AAAI. When I
listened to the talk he gave at Cycorp in
Steve, Ben, do you have any gauge as to what kind of grants are hot
right now or what kind of narrow AI projects with AGI implications have
recently been funded through military agencies?
The list would be very long. Just look at the DARPA IPTO website for
starters...