It is true that a portion of the process
of emotion is not under our conscious control. There are in fact many
cognitive functions underlying our conscious thoughts that are not
subject to introspection or direct control, though perhaps not beyond our
understanding. We necessarily have a limited ability to watch our own
thought processes, both to leave time for thinking about the important stuff
and to avoid an infinite regress. This limitation is "hardwired" into our
design. In an AI, the ability to selectively observe and control any cognitive
function is a possible design option. The fact that there will
not be time or resources to monitor every mental process, and that most will
therefore be automatic, does not make those processes emotions. Lack of
observation and lack of control do not mean lack of understanding.
I agree that there will necessarily be automatic
functions in a practical mind. I don't agree that these processes have to
be characterized or shaped as emotions. I expect to see both emotional AIs and
non-emotional AIs. We don't know enough yet to predict which will
function better.
1. Highly emotional AI (out of control)
2. Moderately emotional AI (like us, undependable)
3. Slightly emotional AI (your supposition, possibly good)
4. Non-emotional AI (my choice, including simulated emotions for human interaction)
Mike Deering.
