Hi Ben,

> Question: Will AGI's experience emotions like humans do?
> Answer:
> http://www.goertzel.org/dynapsyc/2004/Emotions.htm

I'm wondering whether *social* organisms are likely to have a more 
active emotional life, because their inner psychological states need to 
be flagged physiologically to other organisms that must be able to read 
them.  This will also apply across species in challenge-and-response 
situations (buzz off or I'll bite you, etc.).  Your point that physiological 
states operating outside the mental processes (those handled by the 
multiverse modeller) are likely to bring on feelings of emotion makes 
sense in situations involving trans-entity communication.  Physiologically 
flagged emotional states (flushed face/body, raised hackles, a bared-teeth 
snarl, a broad grin, aroused sexual organs, etc.) could trigger a 
(pre-patterned?) response in another organism on an organism-wide, 
decentralised basis - tying in with your idea that certain responses 
require a degree of speed that precludes centralised processing.

So my guess would be that emotions in AIs will be more 
common/stronger if the AIs are *social* (i.e. capable of relating to other 
entities, whether other AIs or social biological entities) and they are able 
both to 'read' and perhaps to 'express/flag' psychological states - through 
'body language' as well as verbal language.

Maybe emotions, as humans experience them, are actually a muddled 
(and therefore interesting!?) hybrid: inner confusion in the multiverse 
modelling system, combined with a broad, patterned communication 
system for projecting and reading *psychological states* - where the 
reason(s) for the state are not communicated, but the existence of the 
state is regarded (subconsciously? pre-programmed?) by one or both 
parties in the communication as important.

Will AIs need to be able to share *psychological states*, as opposed to 
detailed rational data, with other AIs?  If AIs are to be good at 
communicating with humans, then chances are they will need to be able 
to convey some psychological states to humans, since humans seem to 
want to be able to read this sort of information.

Cheers, Philip
