In your paper you take a stab at defining emotions
and explaining how different kinds of emotions relate to goal achievement
and the motivation of important behaviors (fight, flight, reproduction). And
then you go on to say that AIs will have goals, motivations, and important
behaviors, so of course AIs will have emotions. I don't exactly
agree.
I think AIs could have emotions if they were
designed that way. I don't think this is the only way a mind can
work. I doubt it is the best way. Evolution gave feathers to
birds, and feathers are certainly functional, but I don't think that is any
excuse for pasting them on the wings of an F-16. Emotions are evolution's
solution to a motivational problem in biological minds. I don't want my
computer to stop sending my email because it is depressed about the
economy.
Emotions... I don't know. Maybe there are some
applications where they might be useful, such as dealing with humans. But then the
emotions could be faked. Humans do it all the time. I'm trying to
think of a case where real emotions would be a functional advantage to a purpose-built
machine. I can't think of any. Then again, it's late, and I
have to get to bed. I'll sleep on it.
Mike Deering.
