Re: [agi] Within-cell computation in biological neural systems??
From: Brad Wyble [EMAIL PROTECTED]

Nonlinear dendritic integration can be accurately captured by the compartmental model, which divides dendrites into small sections with ion channels and other internal reaction mechanisms. This is the most accurate level of modeling. It may be possible to simplify this model with machine-learning techniques without significant loss in accuracy.

I am well aware of compartmental modelling and have done it myself. But this type of model only accounts for the physical size/character of a dendrite, ignoring, in principle, a whole raft of complex molecular dynamics that might be occurring inside it. Such molecular dynamics will surely contribute to the nonlinear aspects of a dendrite.

Each compartment can have internal models of ligand- and voltage-gated channels, their de-/phosphorylation, and other forms of neuromodulation, etc. So that's one more level of organization, but no more than that. This can be incorporated in the compartmental framework.

Just as an example, a new type of neuron has recently been discovered that can hold a steady state of firing in isolation: apply current and the rate increases, then remains stable at a new level. It's dynamically settable, which blows away all standard integrate-and-fire models.

I don't know the exact mechanisms that give rise to that type of neuron, but the compartmental model should be able to cover this. What is needed is a large-scale database of neuronal characteristics (automation).

Yes, one can create a model of a neuron that does this; it's already been done. It's far from a standard model, though.

The problem here is that we do not have enough *data* about the neuronal cell type in question. The basic formulation of the model is OK. We just need to plug in the database, which requires large-scale automated bioassays.

My point, however, was that there is an entire world of complexity within the cell that will be relevant to its role in a neural network (as opposed to simply metabolic) and that we are just beginning to understand.

Unless you're talking about complex intracellular information processing, for which, as I already explained, there is no evidence so far. I'm currently looking into simple organisms to try to get more decisive clues to this issue. I'm optimistic. In the end, there is only one difference: the difference between 'do' and 'talk'. And once you've decided to do it, there is only one direction to go. I'm willing to work on it even though I'm not certain of success. It's an emotional thing, I guess =)

YKY
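The "dynamically settable" neuron mentioned above is easy to caricature in a toy model: adding a single slow state variable to an otherwise standard leaky integrate-and-fire cell lets a transient current pulse leave the cell firing at a new, stable rate. The Python sketch below is only a minimal illustration of that idea; the perfect-integrator mechanism and all parameter values are assumptions for demonstration, not a model of the actual cell type discussed in the thread.

# Toy 'settable' neuron: a leaky integrate-and-fire cell plus one slow
# store w that integrates the input current and then keeps driving the
# cell after the pulse ends. All constants are illustrative only.

def simulate_settable_neuron(pulses, T=3.0, dt=1e-4,
                             tau_m=0.02, v_th=1.0, v_reset=0.0, k=0.5):
    n_steps = int(T / dt)
    v, w = 0.0, 0.0
    spike_times = []
    for i in range(n_steps):
        t = i * dt
        # external drive: sum of rectangular pulses (t_start, t_stop, amplitude)
        i_ext = sum(a for (t0, t1, a) in pulses if t0 <= t < t1)
        w += dt * k * i_ext                 # slow store changes only during input
        v += dt * (-v + i_ext + w) / tau_m  # leaky membrane driven by input + store
        if v >= v_th:                       # threshold crossing: spike and reset
            spike_times.append(t)
            v = v_reset
    return spike_times

if __name__ == "__main__":
    # Two brief pulses; the cell then fires at a higher stable rate after each.
    spikes = simulate_settable_neuron([(0.5, 0.7, 20.0), (1.5, 1.7, 20.0)])
    for a, b in [(0.0, 0.5), (0.8, 1.4), (1.8, 2.9)]:
        rate = sum(1 for s in spikes if a <= s < b) / (b - a)
        print(f"mean rate in [{a}, {b}) s: {rate:.1f} Hz")

A real compartmental treatment would, of course, attach this kind of slow variable (a calcium concentration, channel phosphorylation level, or similar) to each compartment rather than to the cell as a whole, which is consistent with the point that such mechanisms can live inside the compartmental framework once the data to constrain them exist.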
RE: [agi] AGI's and emotions
Ben Goertzel wrote:

Bill, I think that emotions in humans are CORRELATED with value-judgments, but are certainly not identical to them. We can have emotions that are ambiguous in value, and we can have strong value judgments with very little emotion attached to them. -- Ben G

Mike Deering wrote:

Bill, I agree with you that emotions are tied to the motivation of behavior in humans. Humans prefer the experience of some emotions and avoid the experience of others, and therefore generate their behavior to maximize these goals. I think this is a peculiarly biological situation and need not be replicated in AI's. I think in AI's we have the design option to base the motivation of behavior on more rational grounds.

I would say that the behavior of any intelligence must be motivated by values for distinguishing good and bad outcomes, and that emotion is essentially just the word we use for those values in humans. Of course, an AI need not express its values as humans do, through facial expressions, body language, and tone of voice. If an AI needs to communicate with humans, a way of mimicking human emotional expressions will be useful for that communication.

Cheers, Bill
RE: [agi] AGI's and emotions
Ben,

I think that emotions in humans are CORRELATED with value-judgments, but are certainly not identical to them. We can have emotions that are ambiguous in value, and we can have strong value judgments with very little emotion attached to them.

That is reasonable. As I said in my first post on this topic, there is variation in the way people define emotion. The quotes from Edelman and Crick show some precedent for defining emotion essentially as value, but it is also common to define emotion more in terms of expression or physiological response.

Cheers, Bill
p.s., RE: [agi] AGI's and emotions
I said:

That is reasonable. As I said in my first post on this topic, there is variation in the way people define emotion. The quotes from Edelman and Crick show some precedent for defining emotion essentially as value, but it is also common to define emotion more in terms of expression or physiological response.

Another definition of emotion may be in terms of qualia.

Bill
RE: [agi] AGI's and emotions
On Wed, 25 Feb 2004, Ben Goertzel wrote:

Emotions ARE thoughts, but they differ from most thoughts in the extent to which they involve the primordial brain AND the non-neural physiology of the body as well. This non-brain-centricity means that emotions are more out of 'our' control than most thoughts, where 'our' refers to the modeling center of the brain that we associate with the feeling of 'free will.' -- Ben G

I would agree with this. Emotions seem to arise from parts of the brain that your central executive has minimal control over. They can be suppressed and manipulated with effort, but they are distinct in character from thoughts originating in other parts of the brain. It's probably a mistake to characterize emotions as a unitary phenomenon, though. Different emotions have different functions, and likely originate from different structures.
RE: [agi] AGI's and emotions
I guess we call emotions 'feelings' because we feel them - i.e. we can feel the effect they trigger in our whole body, detected via our internal monitoring of physical body condition. Given this, unless AGIs are also programmed for thoughts or goal satisfactions to trigger 'physical' and/or other forms of systemic reaction, I suppose their emotions will have a lot less 'feeling' depth to them than humans and other biological species experience.

That's not the entirety of the difference between emotions and other types of thoughts. A reasoning entity can detect that its thoughts are under the influence of an emotion. For example, consider being in a road-rage situation, which I'm sure we can all relate to. You know full well that your reaction of anger towards someone who has unwittingly committed a minor offense against you is wildly irrational, and yet you can't help but feel a flash of extreme animosity towards them (or maybe towards your steering wheel :)). The fact that you know it's an emotional reaction doesn't prevent you from feeling its effects on your thoughts; it just lets you handle it without acting on it.

So any entity capable of remembering its thought processes would be able to detect the influence of an emotion (at least the human variety) on the current flow of its thoughts, even without body-state markers.

-Brad
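Brad's last point, that an entity which remembers its own appraisals can detect emotional influence without any body-state markers, can be sketched concretely. The Python fragment below is purely an illustrative construction for this thread (the class name, scoring scheme, and threshold are all invented): the agent compares its current appraisal of a situation against its own remembered baseline and flags a large deviation as a likely emotional bias, then acts on the baseline rather than the flash.

from collections import defaultdict, deque

class IntrospectiveAppraiser:
    """Illustrative sketch only: an agent that remembers its own past
    appraisals and flags the current one as 'emotionally biased' when it
    deviates sharply from its own baseline, with no body-state signal."""

    def __init__(self, window=50, threshold=2.0):
        # per-situation memory of recent appraisal scores
        self.history = defaultdict(lambda: deque(maxlen=window))
        self.threshold = threshold

    def appraise(self, situation, raw_score):
        past = self.history[situation]
        baseline = sum(past) / len(past) if past else raw_score
        biased = abs(raw_score - baseline) > self.threshold
        past.append(raw_score)
        if biased:
            # note the flash of emotion but act on the remembered baseline,
            # i.e. feel it without acting on it
            return baseline, "bias detected"
        return raw_score, "no bias"

agent = IntrospectiveAppraiser()
for score in (-0.5, -0.4, -0.6, -0.5):                 # build a calm baseline
    agent.appraise("minor traffic offense", score)
print(agent.appraise("minor traffic offense", -9.0))   # road-rage flash gets flagged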
Re: [agi] AGI's and emotions
Philip Sutton wrote:

I guess we call emotions 'feelings' because we *feel* them - i.e. we can feel the effect they trigger in our whole body, detected via our internal monitoring of physical body condition. Given this, unless AGIs are also programmed for thoughts or goal satisfactions to trigger 'physical' and/or other forms of systemic reaction, I suppose their emotions will have a lot less 'feeling' depth to them than humans and other biological species experience.

It seems to me an AI would not require emotions in order to have *motivations*. Emotions may be necessary to provide a sense of self on the level we associate with human consciousness; however, I don't see that as being of much long-term practical value, and more likely to be an impediment, in a practical AI or other highly advanced intelligence.

- Jef
RE: [agi] AGI's and emotions
Mike,

Regarding your definition of emotion: I almost agree with what you say -- BUT, I think you're missing a basic point.

Emotions do involve data coming into the cognitive centers, vaguely similarly to how perceptual data comes into the cognitive centers. And, as with perception, emotions involve processing that goes on in areas of the brain that are mostly opaque to the cognitive centers. But in the case of emotion, the data comes in from a broadly distributed set of physiological and kinesthetic indicators -- AND from parts of the brain that are concerned with reaction to stimuli and goal-achievement rather than just perceiving. This is qualitatively different from data feeding in from sensors.

Emotions are more similar to unconscious reflex actions than to sensation per se -- but they last longer and are more broadly based than simple reflex actions...

ben g

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of deering
Sent: Wednesday, February 25, 2004 2:19 AM
To: [EMAIL PROTECTED]
Subject: Re: [agi] AGI's and emotions

Bill, I agree with you that emotions are tied to the motivation of behavior in humans. Humans prefer the experience of some emotions and avoid the experience of others, and therefore generate their behavior to maximize these goals. I think this is a peculiarly biological situation and need not be replicated in AI's. I think in AI's we have the design option to base the motivation of behavior on more rational grounds.

Ben, I don't know if my personal definition of emotions will be of much help, as it may not be shared by a very large community, but for what it's worth, here it is.

MIKE DEERING'S PERSONAL DEFINITION OF EMOTIONS: Emotions are a kind of sensory data. The sensory organ that perceives this data is the conscious mind alone. The physical reality which generates this raw data is selected concentrations of neurotransmitters in the brain. Their effects vary with different types of neurons in different locations. Some types of neurons produce more of certain kinds of neurotransmitter than other types of neurons. Those that generate the neurotransmitters are not necessarily the same as those that are more affected. They are also affected by other chemicals produced by glands. It's complicated. These neurochemical phenomena are by evolutionary design causally linked to environmental circumstances and divided into positive and negative types. They are used, by evolutionary design, to positively and negatively reinforce behaviors so as to maximize and minimize the related circumstances. Emotions are not products of cognitive processes but are rather perceptions of neurochemical states and states of activation of selected regions of the brain. Because of the complicated feedback arrangements in the generation of neurotransmitters and hormones, and the neurons' role in this feedback, some limited conscious influence can be exercised in the management of emotions. Emotions can be generated artificially by the introduction of various chemicals to the brain, the direct electrical stimulation of certain neuron clusters, or direct control of environmental circumstances. Certain physical bodily sensations are closely related to emotions: pain to sadness, pleasure to happiness.
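Mike's definition above (emotion as a kind of sensory data generated by neurotransmitter concentrations and used to reinforce behavior) maps fairly directly onto a small architectural sketch. The Python below is only an illustration of that reading, with invented class names, signal names, and constants: a few slowly decaying global scalars stand in for neurotransmitter levels; the cognitive layer can read them like any other sense but cannot set them directly, and they gate how strongly reinforcement updates the agent's learned values.

import random

class NeuromodulatorState:
    """A few slowly decaying global scalars standing in for neurotransmitter
    levels. Events (circumstances) push them around; the cognitive layer can
    only read them, mimicking 'emotion as sensory data'."""

    def __init__(self, decay=0.95):
        self.levels = {"reward_tone": 0.0, "threat_tone": 0.0}
        self.decay = decay

    def on_event(self, reward, threat):
        self.levels["reward_tone"] = self.decay * self.levels["reward_tone"] + reward
        self.levels["threat_tone"] = self.decay * self.levels["threat_tone"] + threat

    def read(self):
        return dict(self.levels)   # perceived by the 'conscious mind alone'

class SimpleAgent:
    """Learned action values are reinforced more strongly when reward_tone is
    high: the 'emotional' signal gates plasticity rather than being a thought."""

    def __init__(self, actions):
        self.values = {a: 0.0 for a in actions}
        self.mood = NeuromodulatorState()

    def act(self):
        # small random jitter breaks ties between equally valued actions
        return max(self.values, key=lambda a: self.values[a] + 0.01 * random.random())

    def learn(self, action, reward, threat=0.0):
        self.mood.on_event(reward, threat)
        tone = self.mood.read()["reward_tone"]
        rate = 0.1 * (1.0 + max(tone, 0.0))   # 'emotion' modulates reinforcement
        self.values[action] += rate * (reward - self.values[action])

agent = SimpleAgent(["approach", "avoid"])
for _ in range(20):
    a = agent.act()
    agent.learn(a, reward=1.0 if a == "approach" else -0.2)
print(agent.values)

Whether anything like this deserves the word "emotion" is exactly the definitional question in this thread; the sketch only shows that "values readable as a sense, set by circumstance, gating reinforcement" is straightforward to implement.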
RE: [agi] AGI's and emotions
Agreed --- we tend to project even abstract experiences back down to our physical layer, and then react to them physically ... a kind of analogy that AGI's are unlikely to pursue so avidly unless specifically designed to do so

ben g

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Philip Sutton
Sent: Wednesday, February 25, 2004 12:00 PM
To: [EMAIL PROTECTED]
Subject: RE: [agi] AGI's and emotions

Emotions ARE thoughts but they differ from most thoughts in the extent to which they involve the "primordial" brain AND the non-neural physiology of the body as well.

I guess we call emotions 'feelings' because we feel them - i.e. we can feel the effect they trigger in our whole body, detected via our internal monitoring of physical body condition. Given this, unless AGIs are also programmed for thoughts or goal satisfactions to trigger 'physical' and/or other forms of systemic reaction, I suppose their emotions will have a lot less 'feeling' depth to them than humans and other biological species experience.

Cheers, Philip
RE: [agi] AGI's and emotions
Folks interested in this thread should check out the draft of Marvin Minsky's upcoming book "The Emotion Machine". It has been available at his web site for quite some time: http://web.media.mit.edu/~minsky/

The current draft doesn't seem to have an executive summary that lays out the main thesis, but in a 12/13/99 posting (http://www.generation5.org/content/1999/minsky.asp), Minsky says:

The central idea is that emotion is not different from thinking. Instead, each emotion is a type or arrangement of thinking. There is no such thing as unemotional thinking, because there always must be a selection of goals, and a selection of resources for achieving them.

From my notes after skimming some of the book about a year ago, it seemed that Minsky sees emotions as kinds of "presets" (his term - "Selectors") that determine what mind resources and goals are active at a given time to solve a particular "problem". [I seem to recall Antonio Damasio also had a similar conception... and he called the emotional "set points" PATTERNS!]

The following is from the draft of Chapter 1, Section 6:

Each of our major emotional states results from switching the set of resources in use by turning certain ones on and other ones off. Any such change will affect how we think, by changing our brain's activities. In other words, our emotional states are not separate and distinct from thoughts; instead, each one is a different way to think. For example, when an emotion like Anger takes over, you abandon some of your ways to make plans. You turn off some safety-defenses. You replace some of your slower-acting resources with ones that tend to more quickly react, and to do so with more speed and strength. You trade empathy for hostility, change cautiousness into aggressiveness, and give less thought to the consequences. And then it may seem (to both you and your friends) that you've switched to a new personality.

Good stuff! (IMHO)

J. W. Johnston
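Minsky's "Selector" idea is concrete enough to caricature in a few lines of code. The Python sketch below is only a toy rendering for this thread (the resource names and the two selectors are invented, not taken from the book): an emotional state is nothing but a choice of which mental resources are switched on, so switching emotion changes how the agent "thinks".

class ResourcefulAgent:
    """Toy rendering of the 'Selector' idea: an emotion is just a switch
    pattern over mental resources, so changing emotion changes thinking."""

    SELECTORS = {
        "calm":  {"on": {"long_term_planning", "empathy", "caution"},
                  "off": {"fast_reflexes", "aggression"}},
        "anger": {"on": {"fast_reflexes", "aggression"},
                  "off": {"long_term_planning", "empathy", "caution"}},
    }

    def __init__(self):
        self.active = set(self.SELECTORS["calm"]["on"])

    def switch_emotion(self, name):
        selector = self.SELECTORS[name]
        self.active |= selector["on"]     # turn these resources on
        self.active -= selector["off"]    # and these off

    def think(self, problem):
        # each active resource contributes its own style of suggestion
        return sorted(f"{resource} -> {problem}" for resource in self.active)

agent = ResourcefulAgent()
print(agent.think("obstacle ahead"))   # planful, cautious, empathetic style
agent.switch_emotion("anger")
print(agent.think("obstacle ahead"))   # fast, aggressive, less cautious style

The interesting questions (how selectors get triggered, how they interact, whether graded mixtures are possible) are exactly what the book is about; the point of the toy is only that "emotion as a way to think" is an implementable stance rather than a metaphor.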
Re: [agi] AGI's and emotions
I'll add one last point here... The Dalai Lama, when talking with Western intelligentsia from various disciplines at Harvard (I think it was Harvard), was asked a question about emotions. He got a very puzzled look on his face. It turned out that the Tibetans, due to their study of the mind, make no distinction between ordinary thought and emotion. So the idea of "emotion" being separate from thought was completely foreign to HHDL.

My own experience tells me that *all* thoughts carry a physiological component... there is no separation between the body and mind in this sense. It's just that most thoughts' effect on our physiology flies under the radar of our everyday awareness, so we only really notice the major emotions/thoughts, due to this kind of numbness. But the accumulation of physiological responses from subtle negative thinking can have a profoundly bad effect on us over time... I think an AGI will also need to watch these subtle accumulations.

--Kevin