Hi all, 
       
      I read an article related to this discussion. I feel it could be of some 
importance. 
http://www.firstscience.com/SITE/ARTICLES/love.asp
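
As a side note on Mike's point below, that "the ability to selectively
observe and control any cognitive function is a possible design option in an
AI": here is a minimal toy sketch of what such a switchable introspection
hook might look like. All the names (Mind, CognitiveFunction, "appraisal",
etc.) are hypothetical illustrations of the design option, not anyone's
actual implementation.

    from typing import Any, Callable

    class CognitiveFunction:
        """A unit of processing that can be selectively monitored or overridden."""
        def __init__(self, name: str, process: Callable[[Any], Any]):
            self.name = name
            self.process = process
            self.monitored = False   # off by default: runs "automatically"
            self.override = None     # optional conscious-control hook

        def run(self, stimulus):
            fn = self.override or self.process
            result = fn(stimulus)
            if self.monitored:       # introspection only when switched on
                print(f"[introspect] {self.name}: {stimulus!r} -> {result!r}")
            return result

    class Mind:
        def __init__(self):
            self.functions = {}

        def add(self, fn):
            self.functions[fn.name] = fn

        def observe(self, name, on=True):
            self.functions[name].monitored = on          # selective observation

        def control(self, name, new_process):
            self.functions[name].override = new_process  # selective control

    # usage: an "appraisal" function runs unmonitored until we choose to watch it
    mind = Mind()
    mind.add(CognitiveFunction("appraisal",
                               lambda s: "threat" if "loud" in s else "neutral"))
    mind.functions["appraisal"].run("loud noise")   # automatic, unobserved
    mind.observe("appraisal")                       # switch introspection on
    mind.functions["appraisal"].run("soft music")   # now visible to the "self"

On this picture, leaving "monitored" off for most functions is the
engineering necessity Mike describes; whether those unmonitored functions
count as emotions is the separate question being debated below.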

Regards, 
Nandakishor


On Wed, 25 Feb 2004, Ben Goertzel wrote:
>
>I don't claim that all unmonitored thought processes are emotional, of
>course.
>
>I think that the most abstract description of emotion is "mental processes
>outside the scope of free will, resulting in widely-distributed effects
>across the mind, often correlated with physiological responses".
>
>How do you define "emotions", Mike?
>
>ben
>   -----Original Message-----
>   From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of deering
>   Sent: Tuesday, February 24, 2004 3:08 PM
>   To: [EMAIL PROTECTED]
>   Subject: Re: [agi] AGI's and emotions
>
>
>   It is true that part of the process of emotion is not under our
>conscious control.  There are in fact many cognitive functions, underlying
>many different conscious thoughts, that are not subject to our introspection
>or direct control, though perhaps not beyond our understanding.  Our ability
>to watch our own thought processes is necessarily limited, both to leave
>time to think about the important stuff and to avoid an infinite regress.
>This limitation is "hardwired" into our design.  The ability to selectively
>observe and control any cognitive function is a possible design option in an
>AI.  The fact that there will not be time or resources to monitor every
>mental process, and that most will therefore be automatic, does not make
>those processes emotions.  Lack of observation and lack of control do not
>mean lack of understanding.
>
>   I agree that there will necessarily be automatic functions in a practical
>mind.  I don't agree that these processes have to be characterized or shaped
>as emotions.  I expect to see both emotional AIs and non-emotional AIs.  We
>don't know enough yet to predict which will function better.
>
>   1.  highly emotional AI.  (out of control)
>
>   2.  moderately emotional AI.  (like us, undependable)
>
>   3.  slightly emotional AI.  (your supposition, possibly good)
>
>   4.  non-emotional AI.  (my choice, including simulated emotions for human
>interaction)
>
>
>   Mike Deering.

