On Nov 2, 2007 4:54 AM, Vladimir Nesov <[EMAIL PROTECTED]> wrote:
> You turn it into a tautology by mistaking 'goals' in general for
> 'feelings'. Feelings form one, somewhat significant at this point,
> part of our goal system. But the intelligent part of the goal system
> is a much more 'complex' thing and can also act as a goal in itself.
> You can say that AGIs will be able to maximize satisfaction of the
> intelligent part too,

Could you please provide one specific example of a human goal which
isn't feeling-based?

> as they are 'vastly more intelligent', but now it has turned into
> the general 'they do what we want', which is essentially what
> Friendly AI is by definition (ignoring the specifics of what 'what
> we want' actually means).

Are you saying that we are unable to sufficiently express what we want?

Regards,
Jiri Jelinek
