--- Tom McCabe <[EMAIL PROTECTED]> wrote:

> 
> --- Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > Language and vision are prerequisites to AGI. 
> 
> No, they aren't, unless you care to suggest that
> someone with a defect who can't see and can't form
> sentences (eg, Helen Keller) is unintelligent.

Helen Keller had language.  One could argue, as Turing did, that language alone
is sufficient for AI.  But everyone has a different opinion on what counts as
AGI and what doesn't.

> Any future Friendly AGI isn't going to obey us exactly
> in every respect, because it's *more moral* than we
> are. Should an FAI obey a request to blow up the
> world?

That is what worries me.  I think it is easier to program an AGI for blind
obedience (its top-level goal is to serve humans) than to program it to make
moral judgments in the best interest of humans, without specifying what that
means.  I gave this example on Digg.  Suppose the AGI (being smarter than us)
figures out that consciousness and free will are illusions of our biologically
programmed brains, and that there is really no difference between a human
brain and a simulation of a brain on a computer.  We may or may not have the
technology for uploading, but suppose the AGI decides (for reasons we don't
understand) that uploading is unnecessary.  It might then conclude that
destroying the human race is in our best interest, or at least irrelevant.

We cannot rule out this possibility, because a lesser intelligence cannot
predict what a greater intelligence will do.  Legg has proved this formally
for the case where intelligence is measured by algorithmic complexity:
http://www.vetta.org/documents/IDSIA-12-06-1.pdf
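Roughly, and this is my informal paraphrase rather than the paper's exact
statement: if a predictor p can learn to correctly predict every computable
sequence x with Kolmogorov complexity K(x) <= n, then the predictor's own
complexity K(p) must itself be on the order of n.  In other words, an agent of
low complexity cannot anticipate the behavior of processes much more complex
than itself, which is the sense in which we cannot predict what a smarter AGI
will decide.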

Or maybe an analogy would be more convincing.  Humans acting in the best
interests of their pets may put them down when the animal has a terminal
disease, or for other reasons the animal cannot comprehend.  Who should make
this decision?  What will happen when the AGI is as far advanced over humans
as humans are over dogs, insects, or bacteria?  Perhaps the smarter it gets,
the less relevant human life will be to it.


-- Matt Mahoney, [EMAIL PROTECTED]
