Jiri Jelinek wrote:
> People will want to enjoy life: yes. And they should, of course.
> But so, of course, will the AGIs.
> Giving AGI the ability to enjoy = potentially asking for serious
> trouble. Why shouldn't AGI just work for us like other tools we
> currently have (no joy involved)?
Isn't there a fundamental contradiction in the idea of something that
can be both a "tool" and "intelligent"? In other words, is the word
"tool" even usable in this context?
To put it the other way around, consider the motivational system of the
best kind of AGI: it is motivated by a balanced set of desires,
including the desire to explore and learn, and empathy for the human
species. Almost by definition, I would think, this simple cluster of
desires and empathic motivations *is* what "gives it pleasure".
But the thing is, you can sometimes change your mind and go get
pleasure in a different way. For example, you could decide to transfer
your mind into the cognitive system of an artificial tiger for a week,
and during that time you would get pleasure from stalking and jumping
on prey animals, basking in the sun, or meeting lady tigers. After
being automatically yanked back into human mental form at the end of
the holiday, would you say that "you" get pleasure from hunting prey,
etc.? Or would you say that you get pleasure from the idea of
[exploring different sensoria]? I think the latter would be true, and
in the same way an AGI, being quite close to us in design, could get
pleasure from [exploring different sensoria] without that changing its
goals or motivations when it was being its native self.
I think that in general, making the AGI as similar to us as possible
(but without the aggressive and dangerous motivations that we are
victims of) would be a good idea, simply because we want them to start
out with a strong empathy for us, and we want them to stay that way.
Does this make sense?
I agree that this is a complicated area, little explored before now.
Richard Loosemore
-----
This list is sponsored by AGIRI: http://www.agiri.org/email