--- On Tue, 11/4/08, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote:

> Personally, I'm not making an AGI that has emotions, and I doubt if
> emotions are generally desirable in AGIs, except when the goal is to
> make human companions (and I wonder why people need them anyway, given
> that there're so many -- *too* many -- human beings around already).

People may want to simulate loved ones who have died, if the simulation is 
accurate enough to be indistinguishable. People may also want to simulate 
themselves in the same way, in the belief that it will make them immortal.

-- Matt Mahoney, [EMAIL PROTECTED]



-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=117534816-b15a34
Powered by Listbox: http://www.listbox.com