Philip Sutton wrote:

> I guess we call emotions 'feelings' because we *feel* them - i.e. we can
> feel the effect they trigger in our whole body, detected via our internal monitoring of physical body condition.

Given this, unless AGIs are also programmed so that thoughts or goal satisfactions trigger 'physical' and/or other forms of systemic reaction, I suppose their emotions will have far less 'feeling' depth than those of humans and other biological species.

It seems to me an AI would not require emotions in order to have *motivations*.


Emotions may be necessary to provide a sense of self at the level we associate with human consciousness; however, I don't see that as having much long-term practical value in a practical AI or other highly advanced intelligence, and it is more likely to be an impediment.

- Jef
