Michael,

> What are your thoughts on creating an AI devoid of humanlike ego from
> the start?
I think egotism is not the same as awareness of a self/other distinction. Egotism arises when the self assumes that it is automatically more important than any (or most) of the 'other'. If an AGI knows there is a community of entities deserving of moral and empathetic consideration, and that this community includes both itself and others, then I think we have a starting point for an egoless AGI.

I guess what I'm saying is that we, as designers and educators of AGIs, should not load the self/other distinction with an assumption of unequal moral value. If an AGI is to model the world, it has to see that there are many agents and that it is one member of that population.

Cheers,
Philip