Jiri Jelinek wrote:
> On Nov 2, 2007 4:54 AM, Vladimir Nesov <[EMAIL PROTECTED]> wrote:
>> You turn it into a tautology by mistaking 'goals' in general for
>> 'feelings'. Feelings form one, somewhat significant at this point,
>> part of our goal system. But the intelligent part of the goal system
>> is a much more 'complex' thing and can also act as a goal in itself.
>> You can say that AGIs will be able to maximize satisfaction of the
>> intelligent part too,
>
> Could you please provide one specific example of a human goal which
> isn't feeling-based?
Saving your daughter's life. Most mothers would prefer to save their
daughter's life rather than merely to feel that they had saved it.
As proof of this, mothers sometimes sacrifice their own lives to save
their daughters, and never get to feel the result. Yes, this is
rational, for there is no truth that destroys it. And before you
claim all those mothers were theists: there was an atheist police
officer, signed up for cryonics, who ran into the World Trade Center
and died on September 11th. As Tyrone Pow once observed, for an
atheist to sacrifice their life is a very profound gesture.
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
-----
This list is sponsored by AGIRI: http://www.agiri.org/email