In particular, emotions seem necessary (in humans) to a) provide goals,
b) provide pre-programmed constraints (for when logical reasoning doesn't
have enough information), and c) enforce urgency.


Agreed.

But I think that much of the particular flavor of emotions in humans comes
from their relative opacity to the deliberative mind... and this aspect will
not be there to anywhere near the same extent in a well-designed AI.

So, IMO, it becomes a toss-up whether to use the label "emotion" to
describe the emotion-analogues of an AI that has a transparent view into
the innards of those emotion-analogues...

-- Ben G

-----
This list is sponsored by AGIRI: http://www.agiri.org/email