On Tuesday 01 May 2007 14:06, Benjamin Goertzel wrote:
> >   In particular, emotions seem necessary (in humans) to a) provide goals,
> > b) provide pre-programmed constraints (for when logical reasoning doesn't
> > have enough information), and c) enforce urgency.
> ...
> So, IMO, it becomes a toss-up, whether to use the label "emotion" to
> describe the emotion-analogues of an AI with transparent view into the
> innards of its emotion-analogues...
>

It's probably worth pointing out in this connection the Schachter-Singer 
two-factor theory of emotion: that emotion has a cognitive factor and a 
physical arousal factor (and that the physical arousal is THE SAME for all 
emotions). In other words, physical arousal provides the urgency, but just 
what it's urgent to do is determined by a cognitive process not significantly 
different from any other. Furthermore, it is not uncommon for people to 
misattribute arousal from one cause as emotional urgency about another, 
merely because the two happen at the same time.

(see http://en.wikipedia.org/wiki/Two_factor_theory_of_emotion)

Personally, I think that the use of the term emotion in AGI discussions clouds 
the issue. It is clearly not necessary for an AGI to have a physiological 
arousal that prepares the body for fight or flight. The role of arousal as a 
prioritizing mechanism is easily captured by any of a wide variety of 
well-understood heuristics used in operating systems. What the AGI then needs 
is *motivations*, which can flow in a straightforward way from explicit goal 
structures or from reinforcement learning.
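To make the OS analogy concrete, here is a minimal sketch (mine, not anything 
from the post; the class name, goal strings, and numbers are invented for 
illustration): a priority queue in which each pending goal has a base 
priority and an "urgency" signal temporarily boosts it, much as an OS 
scheduler boosts an interactive task.

```python
import heapq

class GoalScheduler:
    """Illustrative only: urgency as a priority boost, not a separate
    physiological channel. Lower effective priority runs sooner, as with
    Unix "nice" values."""

    def __init__(self):
        self._heap = []   # entries are (effective_priority, seq, goal)
        self._seq = 0     # tie-breaker so equal priorities pop in FIFO order

    def add(self, goal, base_priority, urgency=0.0):
        # Urgency simply subtracts from the base priority; all the
        # "emotion-like" work is done by ordinary scheduling arithmetic.
        effective = base_priority - urgency
        heapq.heappush(self._heap, (effective, self._seq, goal))
        self._seq += 1

    def next_goal(self):
        return heapq.heappop(self._heap)[2]

sched = GoalScheduler()
sched.add("tidy the knowledge base", base_priority=5)
sched.add("answer the user", base_priority=5, urgency=3.0)
sched.add("long-term planning", base_priority=8)
# "answer the user" pops first: urgency lowered its effective
# priority (5 - 3 = 2) below the other goals.
```

The point of the sketch is that nothing here requires an arousal state; 
urgency reduces to one more number fed into a well-understood scheduling 
heuristic.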

Josh

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936