Jiri,

You turn it into a tautology by mistaking 'goals' in general for 'feelings'. Feelings form one part of our goal system, a somewhat significant part at this point. But the intelligent part of the goal system is a much more complex thing, and it can also act as a goal in itself. You could say that AGIs will be able to maximize satisfaction of the intelligent part too, since they are 'vastly more intelligent', but at that point it has turned into a general 'they do what we want', which is essentially what Friendly AI is by definition (ignoring the specifics of what 'what we want' actually means).
On 11/2/07, Jiri Jelinek <[EMAIL PROTECTED]> wrote:
>
> > Is this really what you *want*?
> > Out of all the infinite possibilities, this is the world in which you
> > would most want to live?
>
> Yes, great feelings only (for as many people as possible), and the
> "engine" being continuously improved by AGI, which would also take care
> of all related tasks, including safety issues etc. The quality of our
> life is in feelings. Or do we know anything better? We do what we do
> for feelings, and we alter them very indirectly. We can optimize and
> get the greatest stuff allowed by the "current" design through direct
> altering/stimulation (changes would be required so we can take it
> non-stop). Whatever you enjoy, it's not really the thing you are
> doing. It's the triggered feeling, which can be obtained and
> intensified more directly. We don't know exactly how those great
> feelings (/qualia) work, but there are a number of chemicals and brain
> regions known to play key roles.
>
> Regards,
> Jiri Jelinek
>
> On Nov 2, 2007 12:54 AM, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:
> > Jiri Jelinek wrote:
> > >
> > > Let's go to an extreme: Imagine being an immortal idiot. No matter
> > > what you do & how hard you try, the others will always be so much
> > > better at everything that you will eventually become totally
> > > discouraged, or even afraid to "touch" anything, because it would just
> > > always demonstrate your relative stupidity (/limitations) in some way.
> > > What a life. Suddenly, there is this amazing pleasure machine as a new
> > > god-like style of living for poor creatures like you. What do you do?
> >
> > Jiri,
> >
> > Is this really what you *want*?
> >
> > Out of all the infinite possibilities, this is the world in which you
> > would most want to live?
> >
> > --
> > Eliezer S. Yudkowsky                  http://singinst.org/
> > Research Fellow, Singularity Institute for Artificial Intelligence

--
Vladimir Nesov
mailto:[EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=60236618-350050
