The experience of "emotion," in my view, occurs when one component of a mind -- which I call the "virtual multiverse modeler," and which is responsible for the feeling we call "free will" -- finds itself unable to construct models of large phenomena within the mind.  This can happen for several reasons.  One reason is that -- as often happens in humans -- large phenomena within the mind are driven by "primordial" brain subsystems that are opaque to the rational, modeling mind.  This will not occur in AGIs unless they're specifically designed that way.  Another reason is that there are very complex, unpredictable dynamics within the cognitive mind itself -- this source of emotion could occur within an AGI as well as (and perhaps even more strongly than in) humans.
 
So, I don't think it's useful to design AGIs specifically to have emotions -- unless one wants to build an AGI with a specific lobe designed to experience rough emulations of human emotions, with the goal of making the AGI understand humans better.  However, I think that some sorts of emotions will necessarily arise in any intelligent system -- there's no way to avoid it because, given finite computational resources, there's no way to prevent a system from experiencing major surprising internal events.  The only way to avoid emotion entirely would be to make a system entirely predictable by its own virtual multiverse modeler, but I'm pretty sure that is incompatible with general intelligence.
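The "surprising internal events" idea above can be sketched as a toy self-model that predicts its own next internal state and flags a surprise when the prediction error gets large.  This is purely my own illustration -- the class name, threshold, and update rule are invented for the sketch, not anything from an actual AGI design:

```python
# Toy sketch (illustrative only): a system whose self-model forecasts its next
# internal state; when the forecast misses badly, the event is "surprising" --
# the rough analogue of emotion described above.

class SelfModelingAgent:
    def __init__(self, surprise_threshold=0.5):
        self.threshold = surprise_threshold
        self.predicted_state = 0.0  # self-model's forecast of internal activation

    def step(self, actual_state):
        """Compare the self-model's prediction with what actually occurred."""
        error = abs(actual_state - self.predicted_state)
        surprised = error > self.threshold
        # Crude self-model update: move the prediction halfway toward reality.
        self.predicted_state += 0.5 * (actual_state - self.predicted_state)
        return surprised

agent = SelfModelingAgent()
print(agent.step(0.1))   # small deviation: modelable, no "surprise"
print(agent.step(2.0))   # large unmodeled swing: "surprise"
```

With finite resources the self-model can never track every internal dynamic, so such threshold crossings are unavoidable -- which is the point of the argument.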
 
-- Ben G 
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]On Behalf Of deering
Sent: Tuesday, February 24, 2004 2:16 AM
To: [EMAIL PROTECTED]
Subject: Re: [agi] AGI's and emotions

In your paper you take a stab at defining emotions and explaining different kinds of emotions' relationship to goal achievement and the motivation of important behaviors (fight, flight, reproduction).  You then go on to say that AIs will have goals, motivations, and important behaviors, so of course AIs will have emotions.  I don't exactly agree.
 
I think AIs could have emotions if they were designed that way.  I don't think this is the only way a mind can work, and I doubt it is the best way.  Evolution gave feathers to birds, and feathers are certainly functional, but that is no excuse for pasting them on the wings of an F-16.  Emotions are evolution's solution to a motivational problem in biological minds.  I don't want my computer to stop sending my email because it is depressed about the economy.
 
Emotions... I don't know.  Maybe there are some applications, such as dealing with humans, where they might be useful.  But then the emotions could be faked -- humans do it all the time.  I'm trying to think of a case where real emotions would be a functional advantage to a purpose-built machine, and I can't think of any.  Then again, it's late, and I have to get to bed.  I'll sleep on it.
 
 
Mike Deering.


To unsubscribe, change your address, or temporarily deactivate your subscription, please go to http://v2.listbox.com/member/[EMAIL PROTECTED]
