RE: [agi] AGI's and emotions

2004-03-02 Thread Gus Constan

Emotion is not sensory data but rather a product of it. From the machine point of view, emotion is another reasoning
faculty, invoked from archetypal imprints adjusting to a sensory-cognitive
pattern (the resolution process). Emotion is the steering heuristic encapsulating
the resolution domain. One may say it's the seed of reason,
or at least the path it traces.



Gus





-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Ben Goertzel
Sent: Wednesday, February 25, 2004 11:25 AM
To: [EMAIL PROTECTED]
Subject: RE: [agi] AGI's and emotions

Mike,

Regarding your definition of emotion: I almost agree with what you say -- BUT, I think you're
missing a basic point. Emotions do involve data coming into the cognitive
centers, vaguely similarly to how perceptual data comes into the cognitive centers.
And, as with perception, emotions involve processing that goes on in areas of
the brain that are mostly opaque to the cognitive centers. But in the
case of emotion, the data comes in from a broadly distributed set of
physiological and kinesthetic indicators -- AND from parts of the brain that
are concerned with reaction to stimuli and goal-achievement rather than just
perceiving. This is qualitatively different from data feeding in from
sensors. Emotions are more similar to unconscious reflex actions than
to sensation per se -- but they last longer and are more broadly based than
simple reflex actions...

ben g





-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of deering
Sent: Wednesday, February 25, 2004 2:19 AM
To: [EMAIL PROTECTED]
Subject: Re: [agi] AGI's and emotions



Bill, I agree with you that emotions
are tied to motivation of behavior in humans. Humans prefer the
experience of some emotions and avoid the experience of others, and therefore
generate their behavior to maximize these goals. I think this is a
peculiarly biological situation and need not be replicated in AI's. I
think in AI's we have the design option to base the motivation of behavior on
more rational grounds.

Ben, I don't know if my personal
definition of emotions will be of much help as it may not be shared by a very
large community, but for what it's worth, here it is.

MIKE DEERING'S PERSONAL DEFINITION
OF EMOTIONS: Emotions are a kind of sensory data. The sensory organ
that perceives this data is the conscious mind alone. The physical
reality which generates this raw data consists of selected concentrations of
neurotransmitters in the brain. Their effects vary with different types
of neurons in different locations. Some types of neurons produce more of
certain kinds of neurotransmitter than other types of neurons. Those that
generate the neurotransmitters are not necessarily the same as those that are
more affected. They are also affected by other chemicals produced by
glands. It's complicated. These neurochemical phenomena are by
evolutionary design causally linked to environmental circumstances and divided
into positive and negative types. They are used, by evolutionary design,
to positively and negatively reinforce behaviors to maximize and minimize the
related circumstances. Emotions are not products of cognitive processes
but are rather perceptions of neurochemical states and states of activation of
selected regions of the brain. Because of the complicated feedback
arrangements in the generation of neurotransmitters and hormones, and the
neurons' role in this feedback, some limited conscious influence can be
exercised in the management of emotions. Emotions can be generated
artificially by the introduction of various chemicals to the brain, the direct
electrical stimulation of certain neuron clusters, or direct control of
environmental circumstances. Certain physical bodily sensations are
closely related to emotions: pain to sadness, pleasure to happiness.
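
A minimal Python sketch of this definition, purely for illustration (the class and behavior names below are invented here, not Deering's): emotions modeled as valenced internal signals that the conscious mind only perceives, never generates, and that are used to positively or negatively reinforce whatever behavior preceded them.

from dataclasses import dataclass

@dataclass
class AffectSignal:
    name: str          # e.g. "sadness", "happiness"
    valence: float     # negative = aversive, positive = appetitive
    intensity: float   # 0.0 .. 1.0, analogous to a neurotransmitter concentration

class ConsciousMind:
    # Perceives affect signals (it does not generate them) and uses them
    # as reinforcement over recently selected behaviors.
    def __init__(self):
        self.behavior_values = {}  # behavior -> learned value

    def perceive(self, signal, recent_behavior, lr=0.1):
        # Positive signals reinforce the behavior, negative signals punish it,
        # mirroring the definition's division into positive and negative types.
        delta = lr * signal.valence * signal.intensity
        self.behavior_values[recent_behavior] = self.behavior_values.get(recent_behavior, 0.0) + delta

mind = ConsciousMind()
mind.perceive(AffectSignal("happiness", +1.0, 0.8), "seek_company")
mind.perceive(AffectSignal("sadness", -1.0, 0.5), "touch_hot_stove")
print(mind.behavior_values)   # {'seek_company': 0.08, 'touch_hot_stove': -0.05}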


Re: [agi] AGI's and emotions

2004-02-26 Thread Kevin



It's true that nearly all thoughts have some physiological/primordial-brain
associations, but in some cases (the ones we call "emotions") these associations
are the DOMINANT part of the thought/experience, whereas in other cases they're
only a minor aspect...
---
Sure, but it's still all just "thoughts", which was my main point... I'll also note that these
associations are highly variable across humans. Certain humans are able to
have a high degree of control over the mind/body. Also, I can't really
agree that these "associations are the dominant part of the
thought/experience". They may be the dominant part of the experience, but
the thought precedes the experience (by a very small increment of time, yet
discernible to the trained mind) and is therefore
the controller. So the thought itself is the dominant
factor.

--Kevin

  
  It is interesting if Tibetans don't make the distinction between thought and
  emotion so crisply as we do. Of course, I'm sure there are many things
  they distinguish that we don't habitually distinguish, as well.
  Different cultural systems divide up the world in different ways, as we all
  know...

  That's true Ben, but Tibetan minds are the same as Western minds in essence. And
  they've got a few thousand years of culture built around the understanding of
  the fundamental nature of mind, so their opinion is more valid than
  most.

  --Kevin






RE: [agi] AGI's and emotions

2004-02-26 Thread Ben Goertzel




I disagree that thought necessarily precedes experience. I am defining
"thought" here as related to cognitive rather than purely physiological
activity. I think that in many cases, physiological reactions drive
cognitive activity rather than vice versa. Perhaps you are defining
"thought" differently, though.

-- Ben G

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Kevin
  Sent: Thursday, February 26, 2004 12:59 PM
  To: [EMAIL PROTECTED]
  Subject: Re: [agi] AGI's and emotions
  It's true that nearly all thoughts have some physiological/primordial-brain
  associations, but in some cases (the ones we call "emotions") these
  associations are the DOMINANT part of the thought/experience, whereas in other
  cases they're only a minor aspect...
  ---
  Sure, but it's still all just "thoughts", which was my main point... I'll also note that these
  associations are highly variable across humans. Certain humans are able
  to have a high degree of control over the mind/body. Also, I can't
  really agree that these "associations are the dominant part of the
  thought/experience". They may be the dominant part of the experience,
  but the thought precedes the experience (by a very small increment of
  time, yet discernible to the trained mind) and is therefore
  the controller. So the thought itself is the dominant
  factor.
  
  --Kevin
  

It is interesting if Tibetans don't make the distinction between thought and
emotion so crisply as we do. Of course, I'm sure there are many things
they distinguish that we don't habitually distinguish, as well.
Different cultural systems divide up the world in different ways, as we all
know...

That's true Ben, but Tibetan minds are the same as Western minds in essence. And
they've got a few thousand years of culture built around the understanding
of the fundamental nature of mind, so their opinion is more valid than
most.

--Kevin
  
  




Re: [agi] AGI's and emotions

2004-02-26 Thread G71AI
I just spent 10 minutes trying to figure out a definition of emotion for the purpose 
of AI.
Here is the thought:
http://www.mageo.com/home/GEORGE_71/index.html?g71p=define.html#emotion

Sincerely,
Jiri Jelinek



RE: [agi] AGI's and emotions

2004-02-26 Thread Ben Goertzel




It's 
true that nearly all thoughts have some physiological/primordial-brain 
associations, but in some cases (the ones we call "emotions") these associations 
are the DOMINANT part of the thought/experience, whereas in other cases they're 
only a minor aspect...

It is 
interesting if Tibetans don't make the distinction between thought and emotion 
so crisply as we do. Of course, I'm sure there are many things they 
distinguish that we don't habitually distinguish, as well. Different 
cultural systems divide up the world in different ways, as we all 
know...

-- Ben G

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Kevin
  Sent: Wednesday, February 25, 2004 6:53 PM
  To: [EMAIL PROTECTED]
  Subject: Re: [agi] AGI's and emotions
  I'll add one last point here.. the Dalai Lama,
  when talking with Western intelligentsia from various disciplines at Harvard (I
  think it was Harvard) was asked a question about emotions. He got a
  very puzzled look on his face. It turned out that the Tibetans, due to
  their study of the mind, made no distinction between ordinary thought and
  emotion. So the idea of "emotion" being separate from thought was
  completely foreign to HHDL..

  My own experience tells me that *all* thoughts carry a
  physiological component.. there is no separation between the body and mind in
  this sense. It's just that most thoughts' effect on our physiology flies
  under the radar of our everyday awareness... So we only really notice the major
  emotions/thoughts due to this kind of numbness. But the accumulation of
  physiological responses from subtle negative thinking can have a very
  profoundly bad effect on us over time... I think an AGI will also need to watch
  these subtle accumulations..

  --Kevin
  
    - Original Message -
    From: J. W. Johnston
    To: [EMAIL PROTECTED]
    Sent: Wednesday, February 25, 2004 5:36 PM
    Subject: RE: [agi] AGI's and emotions

Folks interested in this thread should check out the draft of Marvin
Minsky's upcoming book "The Emotion Machine". Been available at his web site
for quite some time:
http://web.media.mit.edu/~minsky/

The current draft doesn't seem to have an executive summary that lays
out the main thesis, but in a 12/13/99 posting (http://www.generation5.org/content/1999/minsky.asp),
Minsky says:

The central idea is that emotion is not 
different from thinking. Instead, each emotion is a type or arrangement of 
thinking. There is no such thing as unemotional thinking, because there 
always must be a selection of goals, and a selection of resources for 
achieving them. 


From my notes after skimming some of the book about a year ago,
it seemed that Minsky sees emotions as kinds of "presets" (his term -
"Selectors") that determine what mind resources and goals are active at a
given time to solve a particular "problem". [I seem to recall Antonio
Damasio also had a similar conception... and he called the emotional "set
points" PATTERNS!]

The following is from the draft of Chapter 1 Section 6:


Each 
of our major emotional states results from ‘switching’ the set of resources 
in use—by turning certain ones on and other ones 
off. Any such 
change will affect how we think, by changing our brain’s 
activities.

In other 
words, our emotional states are not separate and distinct from thoughts; 
instead, each one is a different way to think.



For example, when an emotion 
like Anger ‘takes over,’ you 
abandon some of your ways to make plans. You turn off some safety-defenses. 
You replace some of your slower-acting resources with ones that tend to more 
quickly react—and to do with more speed and strength. You trade empathy for 
hostility, change cautiousness into aggressiveness, and give less thought to 
the consequences. And then it may seem (to both you and your friends) that 
you’ve switched to a new personality.

Good stuff! (IMHO)

J. W. Johnston

  
  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Ben Goertzel
  Sent: Wednesday, February 25, 2004 11:25 AM
  To: [EMAIL PROTECTED]
  Subject: RE: [agi] AGI's and emotions
  
  Agreed --- we tend to project even abstract experiences back down 
  to our physical layer, and then react to them physically ... a kind of 
  analogy that AGI's are unlikely to pursue so avidly unless specifically 
  designed to do so
  
  ben g
  
    -Original Message-
    From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
    Sent: Wednesday, February 2

RE: [agi] AGI's and emotions

2004-02-25 Thread Ben Goertzel

Bill,

I think that emotions in humans are CORRELATED with value-judgments, but are
certainly not identical to them.

We can have emotions that are ambiguous in value, and we can have strong
value judgments with very little emotion attached to them.

-- Ben G


  Bill, I agree with you that emotions are tied to
  motivation of behavior in humans.  Humans prefer the
  experience of some emotions and avoid the experience of
  others, and therefore generate their behavior to maximize
  these goals.  I think this is a peculiarly biological
  situation and need not be replicated in AI's.  I think in
  AI's we have the design option to base the motivation of
  behavior on more rational grounds.

 I would say that behavior of any intelligence must be
 motivated by values for distinguishing good and bad
 outcomes, and that emotion is essentially just a
 word we use for those values in humans. Of course, an
 AI need not express its values as humans do, through
 facial expressions, body language, and tone of voice.
 If an AI needs to communicate with humans, a way of
 mimicking human emotional expressions will be useful
 for that communication.

 Cheers,
 Bill
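
A small sketch of Bill's point, with all function names invented here for illustration: the core motivation can be a plain value function over predicted outcomes, while human-style emotional expression is a separate, optional layer used only when communicating with people.

# Core motivation: a value function distinguishing good and bad predicted outcomes.
def choose_action(predicted_outcomes, value):
    # Pick the action whose predicted outcome the value function rates highest.
    return max(predicted_outcomes, key=lambda a: value(predicted_outcomes[a]))

# Optional expression layer: only needed for communicating with humans.
def express_for_humans(outcome_value):
    # Map an internal value judgment onto a mimicked human emotional display.
    if outcome_value > 0.5:
        return "smile"
    if outcome_value < -0.5:
        return "frown"
    return "neutral expression"

predicted = {"help_user": 0.9, "ignore_user": -0.2}
best = choose_action(predicted, value=lambda v: v)   # identity value function, for the sketch
print(best, express_for_humans(predicted[best]))     # help_user smile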



RE: [agi] AGI's and emotions

2004-02-25 Thread Bill Hibbard
Ben,

 I think that emotions in humans are CORRELATED with value-judgments, but are
 certainly not identical to them.

 We can have emotions that are ambiguous in value, and we can have strong
 value judgments with very little emotion attached to them.

That is reasonable. As I said in my first post on this topic,
there is variation in the way people define emotion. The
quotes from Edelman and Crick show some precedent for
defining emotion essentially as value, but it is also common
to define emotion more in terms of expression or physiological
response.

Cheers,
Bill



p.s., RE: [agi] AGI's and emotions

2004-02-25 Thread Bill Hibbard
I said:

 That is reasonable. As I said in my first post on this topic,
 there is variation in the way people define emotion. The
 quotes from Edelman and Crick show some precedent for
 defining emotion essentially as value, but it is also common
 to define emotion more in terms of expression or physiological

Another definition of emotion may be in terms of qualia.

Bill



RE: [agi] AGI's and emotions

2004-02-25 Thread Brad Wyble
On Wed, 25 Feb 2004, Ben Goertzel wrote:

 
 Emotions ARE thoughts but they differ from most thoughts in the extent to
 which they involve the primordial brain AND the non-neural physiology of
 the body as well.  This non-brain-centricity means that emotions are more
 out of 'our' control than most thoughts, where 'our' refers to the
 modeling center of the brain that we associate with the feeling of 'free
 will.'
 
 -- Ben G
 

I would agree with this.   Emotions seem to arise from parts of the brain 
that your central executive has minimal control over.  They can be 
suppressed and manipulated with effort but they are distinct 
from the character of thoughts originating in other parts of the brain.  

It's probably a mistake to characterize emotions as a unitary phenomenon 
though.  Different emotions have different functions, and likely originate 
from different structures. 



RE: [agi] AGI's and emotions

2004-02-25 Thread Brad Wyble

 I guess we call emotions 'feelings' because we feel them - ie. we can 
 feel the effect they trigger in our whole body, detected via our internal 
 monitoring of physical body condition.
 
 Given this, unless AGIs are also programmed for thoughts or goal 
 satisfactions to trigger 'physical' and/or other forms of systemic 
 reaction, I suppose their emotions will have a lot less 'feeling' depth to 
 them than humans and other biological species experience.
 

That's not the entirety of the difference between emotions and other types 
of thoughts.  A reasoning entity can detect that their thoughts are under 
the influence of an emotion.  For example, consider being in a road rage 
situation, which I'm sure we can all relate to.  

You know full well that 
your reaction of anger towards someone who's unwittingly committed a 
minor offense to you is wildly irrational and yet you can't help but feel 
a flash of extreme animosity towards someone else (or maybe your steering 
wheel :)).  The fact that you know it's an emotional 
reaction doesn't prevent you from feeling its effects on your thoughts, it 
just lets you handle it without acting on it.

So any entity capable of remembering their thought processes would be able 
to detect the influence of an emotion (at least the human variety) on 
the current flow of their thoughts even without body-state markers.  
-Brad
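
A toy sketch of that last point (the category names and threshold are invented for illustration): an agent that records its own reasoning steps can flag when the statistics of its current thought flow drift far from its own baseline, and so detect an emotion-like bias without any body-state markers.

from collections import Counter

class SelfMonitor:
    # Remembers the agent's own thought processes and flags anomalous recent flows.
    def __init__(self):
        self.baseline = Counter()   # long-run counts of thought categories
        self.total = 0

    def record(self, category):
        self.baseline[category] += 1
        self.total += 1

    def emotionally_biased(self, recent, factor=3.0):
        # Flag if any category is heavily over-represented in the recent flow
        # relative to the agent's own history (e.g. a flash of road-rage hostility).
        for category, count in Counter(recent).items():
            base_rate = self.baseline[category] / self.total if self.total else 0.0
            if base_rate and count / len(recent) > factor * base_rate:
                return True
        return False

monitor = SelfMonitor()
for c in ["plan"] * 80 + ["empathy"] * 18 + ["hostility"] * 2:
    monitor.record(c)
print(monitor.emotionally_biased(["hostility", "hostility", "plan", "hostility"]))  # True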



Re: [agi] AGI's and emotions

2004-02-25 Thread Jef Allbright
Philip Sutton wrote:

  I guess we call emotions 'feelings' because we *feel *them - ie. we can
feel the effect they trigger in our whole body, detected via our 
internal monitoring of physical body condition.

Given this, unless AGIs are also programmed for thoughts or goal 
satisfactions to trigger 'physical' and/or other forms of systemic 
reaction, I suppose their emotions will have a lot less 'feeling' depth 
to them than humans and other biological species experience.
It seems to me an AI would not require emotions in order to have 
*motivations*.

Emotions may be necessary to provide a sense of self on the level we
associate with human consciousness; however, I don't see that as being
of much long-term practical value, and it is more likely to be an impediment,
in a practical AI or other highly advanced intelligence.

- Jef



RE: [agi] AGI's and emotions

2004-02-25 Thread Ben Goertzel




Mike,

Regarding your definition of emotion: I almost agree with what
you say -- BUT, I think you're missing a basic point. Emotions do involve
data coming into the cognitive centers, vaguely similarly to how perceptual data
comes into the cognitive centers. And, as with perception, emotions
involve processing that goes on in areas of the brain that are mostly opaque to
the cognitive centers. But in the case of emotion, the data comes in from
a broadly distributed set of physiological and kinesthetic indicators -- AND
from parts of the brain that are concerned with reaction to stimuli and
goal-achievement rather than just perceiving. This is qualitatively
different from data feeding in from sensors. Emotions are more similar
to unconscious reflex actions than to sensation per se -- but they last longer
and are more broadly based than simple reflex actions...

ben g

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of deering
  Sent: Wednesday, February 25, 2004 2:19 AM
  To: [EMAIL PROTECTED]
  Subject: Re: [agi] AGI's and emotions
  Bill, I agree with you that emotions are tied to
  motivation of behavior in humans. Humans prefer the experience of some
  emotions and avoid the experience of others, and therefore generate their
  behavior to maximize these goals. I think this is a peculiarly
  biological situation and need not be replicated in AI's. I think in AI's
  we have the design option to base the motivation of behavior on more rational
  grounds.
  
  
  Ben, I don't know if my personal definition of
  emotions will be of much help as it may not be shared by a very large
  community, but for what it's worth, here it is.
  
  MIKE DEERING'S PERSONAL DEFINITION OF
  EMOTIONS: Emotions are a kind of sensory data. The sensory organ
  that perceives this data is the conscious mind alone. The physical
  reality which generates this raw data consists of selected concentrations of
  neurotransmitters in the brain. Their effects vary with different types
  of neurons in different locations. Some types of neurons produce more of
  certain kinds of neurotransmitter than other types of neurons. Those
  that generate the neurotransmitters are not necessarily the same as those that
  are more affected. They are also affected by other
  chemicals produced by glands. It's complicated. These
  neurochemical phenomena are by evolutionary design causally linked to
  environmental circumstances and divided into positive and negative types.
  They are used, by evolutionary design, to positively and negatively reinforce
  behaviors to maximize and minimize the related circumstances. Emotions
  are not products of cognitive processes but are rather perceptions of
  neurochemical states and states of activation of selected regions of the
  brain. Because of the complicated feedback arrangements in the
  generation of neurotransmitters and hormones, and the neurons' role in this
  feedback, some limited conscious influence can be exercised in the management
  of emotions. Emotions can be generated artificially by the introduction
  of various chemicals to the brain, the direct electrical stimulation of
  certain neuron clusters, or direct control of environmental
  circumstances. Certain physical bodily sensations are closely related to
  emotions: pain to sadness, pleasure to happiness.
  
  
  
  




RE: [agi] AGI's and emotions

2004-02-25 Thread Ben Goertzel




Agreed 
--- we tend to project even abstract experiences back down to our physical 
layer, and then react to them physically ... a kind of analogy that AGI's are 
unlikely to pursue so avidly unless specifically designed to do 
so

ben g

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
  Sent: Wednesday, February 25, 2004 12:00 PM
  To: [EMAIL PROTECTED]
  Subject: RE: [agi] AGI's and emotions
   Emotions ARE thoughts but they differ from most 
  thoughts in the extent
   to which they involve the "primordial" brain 
  AND the non-neural
   physiology of the body as well. 
  
  
  I guess we call 
  emotions 'feelings' because we feel them - ie. we can feel the effect 
  they trigger in our whole body, detected via our internal monitoring of 
  physical body condition.
  
  Given this, 
  unless AGIs are also programmed for thoughts or goal satisfactions to trigger 
  'physical' and/or other forms of systemic reaction, I suppose their emotions 
  will have a lot less 'feeling' depth to them than humans and other biological 
  species experience.
  
  Cheers, 
  Philip
  
  
  




RE: [agi] AGI's and emotions

2004-02-25 Thread J. W. Johnston



Folks interested in this thread should check out the draft of Marvin Minsky's upcoming
book "The Emotion Machine". Been available at his web site for quite some
time:
http://web.media.mit.edu/~minsky/

The current draft doesn't seem to have an executive summary that lays
out the main thesis, but in a 12/13/99 posting (http://www.generation5.org/content/1999/minsky.asp),
Minsky says:

The central idea is that emotion is not 
different from thinking. Instead, each emotion is a type or arrangement of 
thinking. There is no such thing as unemotional thinking, because there always 
must be a selection of goals, and a selection of resources for achieving them. 



From my notes after skimming some of the book about a year ago, it seemed
that Minsky sees emotions as kinds of "presets" (his term - "Selectors")
that determine what mind resources and goals are active at a given time to solve
a particular "problem". [I seem to recall Antonio Damasio also had a similar
conception... and he called the emotional "set points"
PATTERNS!]

The following is from the draft of Chapter 1 Section 6:


Each of our major emotional states results from 'switching' the set of resources in
use -- by turning certain ones on and other ones off. Any such change will affect how we think, by
changing our brain's activities.

In other words, our emotional states are not separate and distinct from thoughts;
instead, each one is a different way to think.

For example, when an emotion like Anger 'takes over,' you abandon some
of your ways to make plans. You turn off some safety-defenses. You replace some
of your slower-acting resources with ones that tend to more quickly react -- and to
do with more speed and strength. You trade empathy for hostility, change
cautiousness into aggressiveness, and give less thought to the consequences. And
then it may seem (to both you and your friends) that you've switched to a new
personality.

Good stuff! (IMHO)

J. W. Johnston
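
A tiny sketch of the "Selector" idea described above (a loose simplification, not Minsky's own formulation; the resource names are invented): an emotion is a named preset that switches some mental resources on and others off, which then changes how the system thinks.

# Each emotional state is a preset ("Selector") over which mental resources are active.
SELECTORS = {
    "anger": {"planning": False, "safety_defenses": False, "fast_reactions": True,
              "empathy": False, "aggression": True},
    "calm":  {"planning": True, "safety_defenses": True, "fast_reactions": False,
              "empathy": True, "aggression": False},
}

class Mind:
    def __init__(self):
        self.resources = dict(SELECTORS["calm"])   # default configuration

    def switch_emotion(self, emotion):
        # "Switching" the set of resources in use: the emotion IS the new way to think.
        self.resources.update(SELECTORS[emotion])

    def think(self, problem):
        if self.resources["planning"]:
            return "deliberate plan for " + problem
        return "fast reflexive response to " + problem

m = Mind()
print(m.think("obstacle"))    # deliberate plan for obstacle
m.switch_emotion("anger")     # "Anger takes over"
print(m.think("obstacle"))    # fast reflexive response to obstacle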

  
  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Ben Goertzel
  Sent: Wednesday, February 25, 2004 11:25 AM
  To: [EMAIL PROTECTED]
  Subject: RE: [agi] AGI's and emotions
  
  Agreed --- we tend to project even abstract experiences back down to 
  our physical layer, and then react to them physically ... a kind of analogy 
  that AGI's are unlikely to pursue so avidly unless specifically designed to do 
  so
  
  ben g
  
    -Original Message-
    From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
    Sent: Wednesday, February 25, 2004 12:00 PM
    To: [EMAIL PROTECTED]
    Subject: RE: [agi] AGI's and emotions
 Emotions ARE thoughts but they differ from most 
thoughts in the extent
 to which they involve the "primordial" brain 
AND the non-neural
 physiology of the body as well. 


I guess we 
call emotions 'feelings' because we feel them - ie. we can feel the 
effect they trigger in our whole body, detected via our internal monitoring 
of physical body condition.

Given this, 
unless AGIs are also programmed for thoughts or goal satisfactions to 
trigger 'physical' and/or other forms of systemic reaction, I suppose their 
emotions will have a lot less 'feeling' depth to them than humans and other 
biological species experience.

Cheers, 
Philip







Re: [agi] AGI's and emotions

2004-02-25 Thread Kevin



I'll add one last point here.. the Dalai Lama,
when talking with Western intelligentsia from various disciplines at Harvard (I
think it was Harvard) was asked a question about emotions. He got a very
puzzled look on his face. It turned out that the Tibetans, due to their
study of the mind, made no distinction between ordinary thought and
emotion. So the idea of "emotion" being separate from thought was
completely foreign to HHDL..

My own experience tells me that *all* thoughts carry a
physiological component.. there is no separation between the body and mind in this
sense. It's just that most thoughts' effect on our physiology flies under
the radar of our everyday awareness... So we only really notice the major
emotions/thoughts due to this kind of numbness. But the accumulation of
physiological responses from subtle negative thinking can have a very profoundly
bad effect on us over time... I think an AGI will also need to watch these subtle
accumulations..

--Kevin

  - Original Message -
  From: J. W. Johnston
  To: [EMAIL PROTECTED]
  Sent: Wednesday, February 25, 2004 5:36 PM
  Subject: RE: [agi] AGI's and emotions
  
  Folks interested in this thread should check out the draft of Marvin
  Minsky's upcoming book "The Emotion Machine". Been available at his web site
  for quite some time:
  http://web.media.mit.edu/~minsky/

  The current draft doesn't seem to have an executive summary that lays
  out the main thesis, but in a 12/13/99 posting (http://www.generation5.org/content/1999/minsky.asp),
  Minsky says:
  
  The central idea is that emotion is not 
  different from thinking. Instead, each emotion is a type or arrangement of 
  thinking. There is no such thing as unemotional thinking, because there always 
  must be a selection of goals, and a selection of resources for achieving them. 
  
  
  
  From my notes after skimming some of the book about a year ago, it seemed
  that Minsky sees emotions as kinds of "presets" (his term - "Selectors")
  that determine what mind resources and goals are active at a given time to
  solve a particular "problem". [I seem to recall Antonio Damasio also had a
  similar conception... and he called the emotional "set points"
  PATTERNS!]

  The following is from the draft of Chapter 1 Section 6:
  
  
  Each 
  of our major emotional states results from ‘switching’ the set of resources in 
  use—by turning certain ones on and other ones off. Any such change will affect how we think, by 
  changing our brain’s activities.
  
  In other 
  words, our emotional states are not separate and distinct from thoughts; 
  instead, each one is a different way to think.
  
  
  
  For example, when an emotion like 
  Anger ‘takes over,’ you abandon 
  some of your ways to make plans. You turn off some safety-defenses. You 
  replace some of your slower-acting resources with ones that tend to more 
  quickly react—and to do with more speed and strength. You trade empathy for 
  hostility, change cautiousness into aggressiveness, and give less thought to 
  the consequences. And then it may seem (to both you and your friends) that 
  you’ve switched to a new personality.
  
  Good stuff! (IMHO)

  J. W. Johnston
  

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Ben Goertzel
Sent: Wednesday, February 25, 2004 11:25 AM
To: [EMAIL PROTECTED]
Subject: RE: [agi] AGI's and emotions

Agreed --- we tend to project even abstract experiences back down to 
our physical layer, and then react to them physically ... a kind of analogy 
that AGI's are unlikely to pursue so avidly unless specifically designed to 
do so

ben g

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
  Sent: Wednesday, February 25, 2004 12:00 PM
  To: [EMAIL PROTECTED]
  Subject: RE: [agi] AGI's and emotions
   Emotions ARE thoughts but they differ from 
  most thoughts in the extent
   to which they involve the "primordial" brain 
  AND the non-neural
   physiology of the body as well. 
  
  
  I guess we 
  call emotions 'feelings' because we feel them - ie. we can feel the 
  effect they trigger in our whole body, detected via our internal 
  monitoring of physical body condition.
  
  Given this, 
  unless AGIs are also programmed for thoughts or goal satisfactions to 
  trigger 'physical' and/or other forms of systemic reaction, I suppose 
  their emotions will have a lot less 'feeling' depth to them than humans 
  and other biological species experience.
  
  Cheers, 
  Philip
  
  
  

RE: [agi] AGI's and emotions

2004-02-24 Thread Ben Goertzel




The experience of "emotion," in my view, occurs when one component of a mind -- which
I call the "virtual multiverse modeler" and which is responsible for the feeling
we call "free will" -- finds itself unable to construct models of large
phenomena within the mind. This can happen for several reasons. One
reason is that -- as often happens in humans -- large phenomena within the
mind are driven by "primordial" brain subsystems that are opaque to the
rational, modeling mind. This will not occur in AGI's unless they're
specifically designed that way. Another reason is that there are very
complex, unpredictable dynamics within the cognitive mind itself -- this source
of emotion could occur within an AGI as well as (and perhaps better than in)
humans.

So, I don't think it's useful to design AGI's specifically to have emotions -- unless
one wants to build an AGI that has a specific lobe designed to experience rough
emulations of human emotions, with the goal of making the AGI understand
humans better. However, I think that some sorts of emotions will
necessarily arise in any intelligent system -- there's no way to avoid it
because, given finite computational resources, there's no way to avoid a system
experiencing major surprising internal events. The only way to avoid
emotion entirely would be to make a system entirely predictable by its own
virtual multiverse modeler, but I'm pretty sure this is incompatible with
general intelligence...

-- Ben G
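
A rough sketch of this idea (the modeler interface and the surprise threshold are illustrative inventions, not part of any actual design): an "emotion" registers whenever the self-model fails, beyond some threshold, to predict a large change in the system's own internal state.

class VirtualMultiverseModeler:
    # Tries to predict the system's own next internal state.
    def predict(self, state):
        return state   # naive self-model: expects its own state not to change much

class Agent:
    def __init__(self, surprise_threshold=0.5):
        self.modeler = VirtualMultiverseModeler()
        self.state = 0.2
        self.threshold = surprise_threshold

    def step(self, dynamics):
        # Returns True when an emotion-like event occurs: a large internal change
        # that the self-model failed to account for.
        predicted = self.modeler.predict(self.state)
        self.state = dynamics(self.state)   # complex, hard-to-predict internal dynamics
        surprise = abs(self.state - predicted)
        return surprise > self.threshold

agent = Agent()
logistic = lambda s: 3.9 * s * (1 - s)    # stand-in for chaotic cognitive dynamics
print([agent.step(logistic) for _ in range(5)])   # [False, False, True, True, False]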

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of deering
  Sent: Tuesday, February 24, 2004 2:16 AM
  To: [EMAIL PROTECTED]
  Subject: Re: [agi] AGI's and emotions
  In your paper you take a stab at defining 
  emotions and explaining different kinds of emotions' relationship to goals 
  achievement and motivation of important behaviors (fight, flight, 
  reproduction). And then you go on to say that AI's will have goals and 
  motivations and important behaviors, so of course, AI's will have 
  emotions. I don't exactly agree.
  
  I think AI's could have emotions if they were 
  designed that way. I don't think this is the only way a mind can 
  work. I doubt if it is the best way. Evolution gave feathers to 
  birds, and feathers are certainly functional, but I don't think that is any 
  excuse to be pasting them on the wings of an F16. Emotions are 
  evolution's solution to a motivational problem in biological minds. I 
  don't want my computer to stop sending my email because it is depressed about 
  the economy.
  
  Emotions...I don't know. Maybe there are 
  some applications where they might be useful, dealing with humans. But 
  then the emotions could be faked. Humans do it all the time. I'm 
  trying to think of a case where real emotions would be a functional advantage 
  to a purpose built machine. I can't think of any. Then again, it's 
  late, and I have to get to bed. I'll sleep on it.
  
  
  Mike Deering.
  
  




Re: [agi] AGI's and emotions

2004-02-24 Thread deering



An unexpected mental event or an unplanned mental
excursion does not in itself constitute an emotion. An epileptic seizure
is not an emotion. Most emotions, perhaps all, are very predictable from
causes. You win the lottery or the girl next door says "yes" and you are
happy. Someone runs into your classic Beetle, and you are sad. You
finish a major work of great value, and you feel joy. There is nothing
mysterious about these emotions, no unpredictable mental dynamics. I don't
consider "confusion" an emotion. I consider it an error in
processing. I know I'm not telling you anything new. You surely
understand all of this already. Therefore I must be missing some
fundamental aspect of your thoughts on emotions. I have to admit, I've
never been very good at emotions, and tend to ignore them. I feel like we
must be talking past each other, but I can't imagine how we could be ambiguous
about an experience as fundamental as emotion. We all have them.
It's the ocean our thoughts swim in, waves taking us to and fro, and sometimes
crashing us against the rocks.






RE: [agi] AGI's and emotions

2004-02-24 Thread Ben Goertzel




It's 
true that emotional reactions are often predictable on the medium scale -- yeah, 
I can predict that I'll get angry if you hit my wife on the head, or happy if 
you give me a billion dollar check

However, from the point of view of the cognitive mind (in particular the 
decision-making part of the mind, which we associate with "free will"), emotions 
correspond to activity that was neither

a) driven mainly by cognitive activity, nor
b) 
driven mainly by external events

Cognition of external events may *trigger* emotional experiences, but the
dynamics of the experiences themselves are controlled by the mammalian and
reptilian brain, not by the cognitive brain nor by the external world...
This is why there is no "free will" feeling attached to these experiences,
unlike the case with experiences driven more thoroughly and directly by the
cognitive brain.

-- Ben


  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of deering
  Sent: Tuesday, February 24, 2004 11:09 AM
  To: [EMAIL PROTECTED]
  Subject: Re: [agi] AGI's and emotions
  An unexpected mental event or an unplanned mental
  excursion does not in itself constitute an emotion. An epileptic seizure
  is not an emotion. Most emotions, perhaps all, are very predictable from
  causes. You win the lottery or the girl next door says "yes" and you
  are happy. Someone runs into your classic Beetle, and you are sad.
  You finish a major work of great value, and you feel joy. There is
  nothing mysterious about these emotions, no unpredictable mental
  dynamics. I don't consider "confusion" an emotion. I consider it an
  error in processing. I know I'm not telling you anything new. You
  surely understand all of this already. Therefore I must be missing some
  fundamental aspect of your thoughts on emotions. I have to admit, I've
  never been very good at emotions, and tend to ignore them. I feel like
  we must be talking past each other, but I can't imagine how we could be
  ambiguous about an experience as fundamental as emotion. We all have
  them. It's the ocean our thoughts swim in, waves taking us to and fro,
  and sometimes crashing us against the rocks.
  
  




Re: [agi] AGI's and emotions

2004-02-24 Thread deering



It is true that there is a portion of the process 
of emotion that is not under our conscious control. There are in fact many 
cognitive functions underlying lots of different conscious thoughts that are not 
subject to our introspection or direct control, though perhaps not beyond our 
understanding. We necessarily have limited ability to watch our own 
thought processes, in order to have time to think about the important stuff, and 
to avoid an infinite regress. This limitation is "hardwired" in our 
design. The ability to selectively observe and control any cognitive 
function is a possible design option in an AI. The fact that there will 
not be time or resources to monitor every mental process, that most will be 
automatic, does not make it emotion. Lack of observation, and lack of 
control, do not mean lack of understanding. 

I agree that there will necessarily be automatic 
functions in a practical mind. I don't agree that these processes have to 
be characterized or shaped as emotions. I expect to see emotional AI's and 
non-emotional AI's. We don't know enough yet to predict which will 
function better.

1. highly emotional AI. (out of control)

2. moderately emotional AI. (like us, undependable)

3. slightly emotional AI. (your supposition, possibly good)

4. non-emotional AI. (my choice, including simulated emotions for human interaction)


Mike Deering.








RE: [agi] AGI's and emotions

2004-02-24 Thread Ben Goertzel




I don't claim that all unmonitored thought processes are emotional, of course.

I think that the most abstract description of emotion is "mental processes outside
the scope of free will, resulting in widely-distributed effects across the mind,
often correlated with physiological responses"

How do 
you define "emotions", Mike?

ben

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of deering
  Sent: Tuesday, February 24, 2004 3:08 PM
  To: [EMAIL PROTECTED]
  Subject: Re: [agi] AGI's and emotions
  It is true that there is a portion of the process 
  of emotion that is not under our conscious control. There are in fact 
  many cognitive functions underlying lots of different conscious thoughts that 
  are not subject to our introspection or direct control, though perhaps not 
  beyond our understanding. We necessarily have limited ability to watch 
  our own thought processes, in order to have time to think about the important 
  stuff, and to avoid an infinite regress. This limitation is "hardwired" 
  in our design. The ability to selectively observe and control any 
  cognitive function is a possible design option in an AI. The fact that 
  there will not be time or resources to monitor every mental process, that most 
  will be automatic, does not make it emotion. Lack of observation, and 
  lack of control, do not mean lack of understanding. 
  
  I agree that there will necessarily be automatic 
  functions in a practical mind. I don't agree that these processes have 
  to be characterized or shaped as emotions. I expect to see emotional 
  AI's and non-emotional AI's. We don't know enough yet to predict which 
  will function better.
  
  1. highly emotional AI. (out of control)

  2. moderately emotional AI. (like us, undependable)

  3. slightly emotional AI. (your supposition, possibly good)

  4. non-emotional AI. (my choice, including simulated emotions for human interaction)
  
  
  Mike Deering.
  
  
  
  




RE: [agi] AGI's and emotions

2004-02-24 Thread nandakishor koka
Hi all, 
   
  I had read an article related to the discussion. I feel it could be of some
importance.
http://www.firstscience.com/SITE/ARTICLES/love.asp

Regards, 
Nandakishor


On Wed, 25 Feb 2004 Ben Goertzel wrote :

I don't claim that all unmonitored thought processes are emotional, of
course

I think that the most abstract description of emotion is mental processes
outside the scope of free will, resulting in widely-distributed effects
across the mind, often correlated with physiological responses

How do you define emotions, Mike?

ben
   -Original Message-
   From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Behalf
Of deering
   Sent: Tuesday, February 24, 2004 3:08 PM
   To: [EMAIL PROTECTED]
   Subject: Re: [agi] AGI's and emotions


   It is true that there is a portion of the process of emotion that is not
under our conscious control.  There are in fact many cognitive functions
underlying lots of different conscious thoughts that are not subject to our
introspection or direct control, though perhaps not beyond our
understanding.  We necessarily have limited ability to watch our own thought
processes, in order to have time to think about the important stuff, and to
avoid an infinite regress.  This limitation is hardwired in our design.
The ability to selectively observe and control any cognitive function is a
possible design option in an AI.  The fact that there will not be time or
resources to monitor every mental process, that most will be automatic, does
not make it emotion.  Lack of observation, and lack of control, do not mean
lack of understanding.

   I agree that there will necessarily be automatic functions in a practical
mind.  I don't agree that these processes have to be characterized or shaped
as emotions.  I expect to see emotional AI's and non-emotional AI's.  We
don't know enough yet to predict which will function better.

   1.  highly emotional AI.  (out of control)

   2.  moderately emotional AI.  (like us, undependable)

   3.  slightly emotional AI.  (your supposition, possibly good)

   4.  non-emotional AI.  (my choice, including simulated emotions for human
interaction)


   Mike Deering.









Re: [agi] AGI's and emotions

2004-02-23 Thread deering



In your paper you take a stab at defining emotions 
and explaining different kinds of emotions' relationship to goals achievement 
and motivation of important behaviors (fight, flight, reproduction). And 
then you go on to say that AI's will have goals and motivations and important 
behaviors, so of course, AI's will have emotions. I don't exactly 
agree.

I think AI's could have emotions if they were 
designed that way. I don't think this is the only way a mind can 
work. I doubt if it is the best way. Evolution gave feathers to 
birds, and feathers are certainly functional, but I don't think that is any 
excuse to be pasting them on the wings of an F16. Emotions are evolution's 
solution to a motivational problem in biological minds. I don't want my 
computer to stop sending my email because it is depressed about the 
economy.

Emotions...I don't know. Maybe there are some 
applications where they might be useful, dealing with humans. But then the 
emotions could be faked. Humans do it all the time. I'm trying to 
think of a case where real emotions would be a functional advantage to a purpose 
built machine. I can't think of any. Then again, it's late, and I 
have to get to bed. I'll sleep on it.


Mike Deering.






Re: [agi] AGI's and emotions

2004-02-22 Thread Philip Sutton
Hi Ben,

 Question: Will AGI's experience emotions like humans do?
 Answer:
 http://www.goertzel.org/dynapsyc/2004/Emotions.htm

I'm wondering whether *social* organisms are likely to have a more 
active emotional life because inner psychological states need to be 
flagged physiologically to other organisms that need to be able to read 
their states.  This will also apply across species in the case of challenge 
and response situations (buzz off or I'll bite you, etc.).  Your point about 
the physiological states operating outside the mental processes (that 
are handled by the multiverse modeller) being likely to bring on feelings 
of emotion makes sense in a situation involving trans-entity 
communication.  It would be possible for physiologically flagged 
emotional states (flushed face/body, raised hackles, bared teeth snarl, 
broad grin, aroused sexual organs, etc.) to trigger a (pre-patterned?) 
response in another organism on an organism-wide decentralised basis 
- tying in with your idea that certain responses require a degree of 
speed that precludes centralised processing.

So my guess would be that emotions in AIs would be more
common/stronger if the AIs are *social* (ie. capable of relating to any
other entities, ie. other AIs or social biological entities) and they
are able to both 'read' (and perhaps 'express/flag') psychological states
- through 'body language' as well as verbal language.

Maybe emotions, as humans experience them, are actually a muddled
(and therefore interesting!?) hybrid of inner confusion in the multiverse
modelling system and also a broad patterned communication system
for projecting and reading *psychological states*, where the reason(s)
for the state is not communicated but the existence of the state is
regarded (subconsciously?/pre-programmed?) by one or both of the
parties in the communication as being important.

Will AIs need to be able to share *psychological states* as opposed to 
detailed rational data with other AIs?  If AIs are to be good at 
communicating with humans, then chances are that the AIs will need to 
be able to convey some psychological states to humans since humans 
seem to want to be able to read this sort of information.

Cheers, Philip



RE: [agi] AGI's and emotions

2004-02-22 Thread Ben Goertzel


Hi,

You've made two comments in two posts; I'll respond to them both together

1) that sociality may be necessary for spiritual joy to emerge in a mind

Response: Clearly sociality is one thing that can push a mind in the
direction of appreciating its oneness with the universe but I don't see
why it's the only thing that can do so  I think the basic intuitive
truths underlying spiritual traditions can be recognized by ANY mind that
is self-aware and reflective, not just by a social mind.  For instance, if a
mind introspects into the way it constructs percepts, actions and objects --
the interpenetration of the perceived and constructed worlds -- then it
can be led down the path of grokking the harmony between the inner and outer
worlds, in a way that has nothing to do with sociality.

2) that sociality will lead to more intense emotions than asociality

Response: I don't think so.  I think that emotions are largely caused by the
experience of having one's mind-state controlled by internal forces way
outside one's will  Now, in humans, some of these responses are
specifically induced by other humans or animals -- therefore some of our
emotions are explicitly social in nature.  But this doesn't imply that
emotions are necessarily social, nor that sociality is necessarily
emotional -- at least not in any obvious way that I can see

I suppose you could try to construct an argument that sociality presents
computational problems that can ONLY be dealt with by mental subsystems that
operate in an automated way, outside of the scope of human will
However, I don't at present believe this to be true...

-- Ben G



 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
 Behalf Of Philip Sutton
 Sent: Sunday, February 22, 2004 9:27 AM
 To: [EMAIL PROTECTED]
 Subject: Re: [agi] AGI's and emotions


 Hi Ben,

  Question: Will AGI's experience emotions like humans do?
  Answer:
  http://www.goertzel.org/dynapsyc/2004/Emotions.htm

 I'm wondering whether *social* organisms are likely to have a more
 active emotional life because inner psychological states need to be
 flagged physiologically to other organisms that need to be able to read
 their states.  This will also apply across species in the case of
 challenge
 and response situations (buzz off or I'll bite you, etc.).  Your
 point about
 the physiological states operating outside the mental processes (that
 are handled by the multiverse modeller) being likely to bring on feelings
 of emotion makes sense in a situation involving trans-entity
 communication.  It would be possible for physiologically flagged
 emotional states (flushed face/body, raised hackles, bared teeth snarl,
 broad grin, aroused sexual organs, etc.) to trigger a (pre-patterned?)
 response in another organism on an organism-wide decentralised basis
 - tying in with your idea that certain responses require a degree of
 speed that precludes centralised processing.

 So my guess would be that emotions in AIs would be more
 common/stronger if the AIs are *social* (ie. capable of relating to any
 other entities ie. other AIs or with social biological entities)
 and they
 are able to both 'read' (and perhaps 'express/flag') psychological states
 - through 'body language' as well as verbal language.

 Maybe emotions, as humans experience them, are actually a muddled
 (and therefore interesting!?) hybrid of inner confusion in the multiverse
 modelling system and also a broad patterned communication system
 for projecting and reading *psychological states* where the reason(s)
 for the state is not communicated but the existence of the state is
 regarded (subconsciously?/pre-programmed?) by one or both of the
 parties in the communication as being important.

 Will AIs need to be able to share *psychological states* as opposed to
 detailed rational data with other AIs?  If AIs are to be good at
 communicating with humans, then chances are that the AIs will need to
 be able to convey some psychological states to humans since humans
 seem to want to be able to read this sort of information.

 Cheers, Philip





RE: [agi] AGI's and emotions

2004-02-22 Thread Philip Sutton
Hi Ben,  

Why would an AGI be driven to achieve *general* harmony between 
inner and outer worlds - rather than just specific cases of congruence? 

Why would a desire for specific cases of congruence between the inner 
and outer worlds lead an AGI (that is not programmed or trained to do 
so) to appreciate (desire??) to want to be at one with the *universe* 
(when you use that term do you mean the Universe or just the outer 
world?)?  

And is a desire to seek *general* congruence between the inner and
outer world via changing the world rather than changing the self a good
recipe for creating a megalomaniac?

Cheers, Philip



RE: [agi] AGI's and emotions

2004-02-22 Thread Ben Goertzel


 -Original Message-
 From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
 Behalf Of Philip Sutton
 Sent: Sunday, February 22, 2004 12:41 PM
 To: [EMAIL PROTECTED]
 Subject: RE: [agi] AGI's and emotions


 Hi Ben,

 Why would an AGI be driven to achieve *general* harmony between
 inner and outer worlds - rather than just specific cases of congruence?

If one of its guiding principles is to seek maximum joy -- and (as I've
hypothesized) the intensity of a quale is proportional to the size of the
pattern to which the quale is attached -- then it will seek general harmony
because this is a bigger pattern than more specialized harmony.
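
Read literally, the hypothesis can be written as a one-line proportionality (the notation is purely illustrative): with I(q) the intensity of quale q and |P(q)| the size of the pattern it is attached to,

    I(q) \propto |P(q)|

so a system whose guiding principle is to maximize joy will, other things being equal, prefer whichever harmony corresponds to the larger pattern -- general harmony over specialized harmony.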

 Why would a desire for specific cases of congruence between the inner
 and outer worlds lead an AGI (that is not programmed or trained to do
 so) to appreciate (desire??) to want to be at one with the *universe*
 (when you use that term do you mean the Universe or just the outer
 world?)?

The desire for inner/outer congruence is a special case of the desire for
pattern-finding, as manifested in the desires for Growth and Joy that I've
posited as desirable guiding principles...

 And is a desire to seek *general* congruence between the inner and
 outser world via changing the world rather changing the self a good
 recipe for creating a megalomaniac?

This is the sort of reason why I don't posit Joy and Growth in themselves as
the ideal ethic.

Adding Choice to the mix provides a principle-level motivation not to impose
one's own will upon the universe without considering the wills of others as
well...

ben g



RE: [agi] AGI's and emotions

2004-02-22 Thread Philip Sutton



Hi Ben, 

 Adding Choice to the mix provides a principle-level motivation not to
 impose one's own will upon the universe without considering the wills
 of others as well... 

Whose choice - everyone or the AGI? That has to be specified in the 
ethic - otherwise it could be the AGI only - then the AGI would 
*certainly* consider the wills of others as well but only to see that 
they did not block the will of the AGI. 


A non-carefully structured goal set leading to the pursuit of 
choice/growth/joy could still lead to a megalomaniac, seems to me.


Cheers, Philip











RE: [agi] AGI's and emotions

2004-02-22 Thread Ben Goertzel




Yes, 
of course a brief ethical slogan like "choice, growth and joy" is underspecified 
and all the terms need to be better defined, either by example or by formal 
elucidation, etc. I carry out some of this elucidation in the Encouraging 
a Positive Transcension essay that triggered this whole 
dialogue...

ben g

  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Philip Sutton
  Sent: Sunday, February 22, 2004 7:57 PM
  To: [EMAIL PROTECTED]
  Subject: RE: [agi] AGI's and emotions
  Hi Ben, 
  
  
   Adding Choice to the mix provides a 
  principle-level motivation not to
   impose one's own will upon the universe without 
  considering the wills
   of others as well... 
  
  Whose choice - 
  everyone or the AGI? That has to be specified in the ethic - otherwise 
  it could be the AGI only - then the AGI would *certainly* "consider the wills 
  of others as well" but only to see that they did not block the will of 
  the AGI. 
  
  A non-carefully 
  structured goal set leading to the pursuit of choice/growth/joy could still 
  lead to a megalomaniac, seems to me.
  
  Cheers, 
  Philip
  
  
  
  