--- Hank Conn <[EMAIL PROTECTED]> wrote:
> The further the actual target goal state of that particular AI is away from
> the actual target goal state of humanity, the worse.
> 
> The goal of ... humanity... is that the AGI implemented that will have the
> strongest RSI curve also will be such that its actual target goal state is
> exactly congruent to the actual target goal state of humanity.

This was discussed on the Singularity list.  Even if we get the motivational
system and goals right, things can still go badly.  Are the following things
good?

- End of disease.
- End of death.
- End of pain and suffering.
- A paradise where all of your needs are met and wishes fulfilled.

You might think so, and program an AGI with these goals.  Suppose the AGI
figures out that by scanning your brain, copying the information into a
computer, and making many redundant backups, you become immortal.
Furthermore, once your consciousness becomes a computation in silicon, your
universe can be simulated to be anything you want it to be.

The "goals of humanity", like all other species, was determined by evolution. 
It is to propagate the species.  This goal is met by a genetically programmed
individual motivation toward reproduction and a fear of death, at least until
you are past the age of reproduction and you no longer serve a purpose. 
Animals without these goals don't pass on their DNA.

A property of motivational systems is that they cannot be altered.  You cannot
turn off your desire to eat or your fear of pain.  You cannot decide to start
liking what you don't like, or vice versa.  You cannot, because if you could,
you would not have passed on your DNA.

Once your brain is in software, what is to stop you from telling the AGI
(which you built) to reprogram your motivational system so that you are happy
with what you have?  To some extent you can already do this.  When rats can
electrically stimulate their nucleus accumbens by pressing a lever, they do so
nonstop, in preference to food and water, until they die.

I suppose the alternative is not to scan brains, but then you still have
death, disease, and suffering.  I'm sorry, but it is not a happy picture
either way.


-- Matt Mahoney, [EMAIL PROTECTED]
