Linas, BillK

It might currently be hard to accept for association-based human
minds, but things like "roses", "power-over-others", "being worshiped"
or "loved" are just a waste of time as indirect feeling triggers
(assuming a nearly-unlimited ability to optimize).

Regards,
Jiri Jelinek

On Nov 2, 2007 12:56 PM, Linas Vepstas <[EMAIL PROTECTED]> wrote:
> On Fri, Nov 02, 2007 at 12:41:16PM -0400, Jiri Jelinek wrote:
> > On Nov 2, 2007 2:14 AM, Eliezer S. Yudkowsky <[EMAIL PROTECTED]> wrote:
> > >if you could have anything you wanted, is this the end you
> > would wish for yourself, more than anything else?
> >
> > Yes. But don't forget I would also have AGI continuously looking into
> > how to improve my (/our) way of perceiving the pleasure-like stuff.
>
> This is a bizarre line of reasoning. One way that my AGI might improve
> my perception of pleasure is to make me dumber -- electroshock me --
> so that I find Gilligan's Island reruns incredibly pleasurable. Or,
> I dunno, find that heroin addiction is a great way to live.
>
> Or help me with fugue states: "what is the sound of one hand clapping?"
> feed me zen koans till my head explodes.
>
> But it might also decide that I should be smarter, so that I have a more
> acute sense and discernment of pleasure. Make me smarter about roses,
> so that I can enjoy my rose garden in a more refined way. And after I'm
> smarter, perhaps I'll have a whole new idea of what "pleasure" is,
> and what it takes to make me happy.
>
> Personally, I'd opt for this last possibility.
>
> --linas
>
> -----
> This list is sponsored by AGIRI: http://www.agiri.org/email
> To unsubscribe or change your options, please go to:
> http://v2.listbox.com/member/?&;
>
