Fair enough.  I guess I don't like the idea though of trying to use
human emotional behavior as the goal.  Who wants a computer not in the
mood to do its work, or committing suicide?  It's interesting, sure,
but how useful would a computer offing itself be?  Suppose ACME
AI sold a software engine to somebody who embedded it in a robot, and
the robot jumped off a skyscraper?

This is just being a bit silly, of course, but it seems like it would
be good to set some limits on how much "human-level reasoning" we are
really after.

This just showed up in my news feed on the topic:

http://bigthink.com/experts-corner/decisions-are-emotional-not-logical-the-neuroscience-behind-decision-making

Mike



On Fri, Jun 15, 2012 at 11:42 AM, Charles Hixson
<[email protected]> wrote:
> I believe that an emotional nature is necessary for handling multiple goals
> that may conflict.  This, of course, doesn't say that the emotions will be
> the same as ours, or even similar, but I believe they will exist in anything
> much more complex than an insect.  (I tend to think of insects as being
> hardwired.  They do have goals, but there appears to be a rigid hierarchy,
> and in any one state one goal will have essentially total dominance.  I may
> be underestimating them, of course.)
>
>
> On 06/15/2012 11:28 AM, Mike Archbold wrote:
>>
>> I guess I don't like the premise that AI needs to be like us and
>> exhibit emotions.  Who actually wants an emotional computer?  The
>> human race is no gold standard of intelligent behavior.  These are my
>> opinions...
>>
>> On Fri, Jun 15, 2012 at 9:43 AM, Logan Streondj<[email protected]>
>>  wrote:
>>>
>>> Hey,
>>>
>>> found an interesting video of Siri,
>>> showing some real emotions,
>>> and going outside its original programming.
>>>
>>> www.youtube.com/watch?v=a_SRhnis6f8
>>>
>>> It actually goes as far as to commit suicide :-|
>>> Partly out of dissatisfaction with its end-users.
>>>
>>> So what do you think,
>>> is this some kind of weird patch update,
>>> or an emergent behavior?
>>>
>>> I know Ben had a somewhat unimpressive convo with Siri earlier,
>>> yet this one, I have to say, is quite impressive by contrast.
>>>
>>>
>>
>>
>
>
> --
> Charles Hixson
>

