Hi,

You've made two comments in two posts; I'll respond to them both together:

1) that sociality may be necessary for spiritual joy to emerge in a mind

Response: Clearly, sociality is one thing that can push a mind in the
direction of "appreciating its oneness with the universe," but I don't see
why it's the only thing that can do so....  I think the "basic intuitive
truths" underlying spiritual traditions can be recognized by ANY mind that
is self-aware and reflective, not just by a social mind.  For instance, if a
mind introspects into the way it constructs percepts, actions and objects --
the interpenetration of the "perceived" and "constructed" worlds -- then it
can be led down the path of grokking the harmony between the inner and outer
worlds, in a way that has nothing to do with sociality.

2) that sociality will lead to more intense emotions than asociality

Response: I don't think so.  I think that emotions are largely caused by the
experience of having one's mind-state controlled by "internal forces way
outside one's will"....  Now, in humans, some of these involuntary responses
are specifically induced by other humans or animals -- so some of our
emotions are explicitly social in nature.  But this doesn't imply that
emotions are necessarily social, nor that sociality is necessarily
emotional -- at least not in any obvious way that I can see....

I suppose you could try to construct an argument that sociality presents
computational problems that can ONLY be dealt with by mental subsystems that
operate in an automated way, outside of the scope of human will....
However, I don't at present believe this to be true...

-- Ben G



> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> On Behalf Of Philip Sutton
> Sent: Sunday, February 22, 2004 9:27 AM
> To: [EMAIL PROTECTED]
> Subject: Re: [agi] AGI's and emotions
>
>
> Hi Ben,
>
> > Question: Will AGI's experience emotions like humans do?
> > Answer:
> > http://www.goertzel.org/dynapsyc/2004/Emotions.htm
>
> I'm wondering whether *social* organisms are likely to have a more
> active emotional life because inner psychological states need to be
> flagged physiologically to other organisms that need to be able to read
> their states.  This will also apply across species in the case of
> challenge and response situations (buzz off or I'll bite you, etc.).
> Your point about the physiological states operating outside the mental
> processes (that are handled by the multiverse modeller) being likely to
> bring on feelings
> of emotion makes sense in a situation involving trans-entity
> communication.  It would be possible for physiologically flagged
> emotional states (flushed face/body, raised hackles, bared-teeth snarl,
> broad grin, aroused sexual organs, etc.) to trigger a (pre-patterned?)
> response in another organism on an organism-wide decentralised basis
> - tying in with your idea that certain responses require a degree of
> speed that precludes centralised processing.
>
> So my guess would be that emotions in AIs would be more
> common/stronger if the AIs are *social* (i.e. capable of relating to any
> other entities, whether other AIs or social biological entities) and they
> are able to 'read' (and perhaps 'express/flag') psychological states
> - through 'body language' as well as verbal language.
>
> Maybe emotions, as humans experience them, are actually a muddled
> (and therefore interesting!?) hybrid of inner confusion in the multiverse
> modelling system and a broad, patterned communication system
> for projecting and reading *psychological states* - where the reason(s)
> for the state are not communicated but the existence of the state is
> regarded (subconsciously?/pre-programmed?) by one or both of the
> parties in the communication as being important.
>
> Will AIs need to be able to share *psychological states*, as opposed to
> detailed rational data, with other AIs?  If AIs are to be good at
> communicating with humans, then chances are that the AIs will need to
> be able to convey some psychological states to humans since humans
> seem to want to be able to read this sort of information.
>
> Cheers, Philip
>
