I have skimmed many of the postings in this thread, and (although I have
not seen anyone say so) to a certain extent Jiri's position seems
somewhat similar to that in certain Eastern meditative traditions or
perhaps in certain Christian or other mystical "Blind Faiths."

I am not a particularly good meditator, but when I am having trouble
sleeping, I often try to meditate.  There are moments when I have rushes
of pleasure from just breathing, and times when a clear empty mind is
calming and peaceful.

I think such times are valuable.  I, like most people, would like more
moments of bliss in my life.  But I guess I am too much of a product of my
upbringing and education to want only bliss. I like to create things and
ideas.

And besides, the notion of machines that could be trusted to run the world
for us while we seek to surf the endless rush and do nothing to help
support our own existence or that of the machines we would depend upon
strikes me as nothing more than wishful thinking.  The biggest truism about
altruism is that it has never been the dominant motivation in any system
that has ever had it, and there is no reason to believe it would remain
dominant in machines for any historically long period of time.  Survival
of the fittest applies to machines as well as to biological life forms.

If bliss without intelligence is the goal of the machines you imagine
running the world, then for the cost of supporting one human they could
probably keep at least 100 mice in equal bliss.  So if they were driven to
maximize bliss, why wouldn't they kill all the grooving humans and replace
them with grooving mice?  It would provide one hell of a lot more bliss
bang for the resource buck.
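To spell out the arithmetic behind that (the 100:1 cost ratio and the
"equal bliss per creature" figure are just the illustrative guesses above,
not measured numbers):

    Let R be the total resource budget, c_h the cost of keeping one human
    blissful, and c_m = c_h / 100 the cost of one equally blissful mouse.

    Bliss from humans:  B_h = R / c_h
    Bliss from mice:    B_m = R / c_m = 100 * (R / c_h) = 100 * B_h

Under those assumptions a pure bliss-maximizer gets a hundred times the
payoff per unit of resource by switching from humans to mice.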

Ed Porter


-----Original Message-----
From: Jiri Jelinek [mailto:[EMAIL PROTECTED]
Sent: Saturday, November 03, 2007 3:30 PM
To: [email protected]
Subject: Re: [agi] Nirvana? Manyana? Never!


On Nov 3, 2007 12:58 PM, Mike Dougherty <[EMAIL PROTECTED]> wrote:
> You are describing a very convoluted process of drug addiction.

The difference is that I have safety controls built into that scenario.

> If I can get you hooked on heroin or crack cocaine, I'm pretty
> confident that you will abandon your desire to produce AGI in order to
> get more of the drugs to which you are addicted.

Right. We are wired that way. Poor design.

> You mentioned in an earlier post that you expect to have this
> monstrous machine invade my world and 'offer' me these incredible
> benefits.  It sounds to me like you are taking the blue pill and
> living contentedly in the Matrix.

If the AGI that controls the Matrix sticks with the goal system initially
provided by the blue-pill party, then why would we want to sacrifice the
non-stop pleasure? Imagine being periodically unplugged to double-check
that all goes well outside - and, over and over again, finding (after a
very-hard-to-do detailed investigation) that things are going much better
than they likely would if humans were in charge. I bet your attitude toward
being unplugged would soon change to something like "sh*t, not again!".

> If you are going to proselytize
> that view, I suggest better marketing.  The intellectual requirements
> to accept AGI-driven nirvana imply the rational thinking which
> precludes accepting it.

I'm primarily a developer, leaving most of the marketing stuff to others
;-). What I'm trying to do here is to take a closer look at the human goal
system and investigate where it's likely to lead us. My impression is that
most of us have only a very shallow understanding of what we really want.
When messing with AGI, we had better know what we really want.

Regards,
Jiri Jelinek
