On Wed, Nov 5, 2008 at 7:35 AM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
>
>> Personally, I'm not making an AGI that has emotions, and I doubt if
>> emotions are generally desirable in AGIs, except when the goal is to
>> make human companions (and I wonder why people need them anyway, given
>> that there're so many -- *too* many -- human beings around already).
>
> People may want to simulate loved ones who have died, if the simulation is 
> accurate enough to be indistinguishable. People may also want to simulate 
> themselves in the same way, in the belief it will make them immortal.


Yeah, I should qualify my statement: different people will want
different things out of AGI technology. Some want brain emulation of
themselves or loved ones, some want android companions, etc. All of
these things consume free energy (a scarce resource on Earth), so this
is just a new form of the overpopulation problem. I am not against
any particular AGI application; I just want to point out that
AGI-with-emotions is not a necessary goal of AGI.

