--- Mark Waser <[EMAIL PROTECTED]> wrote:
> And thus, we get back to a specific answer to jk's second question. "*US*"
> should be assumed to apply to any sufficiently intelligent goal-driven
> intelligence. We don't need to define "*us*" because I DECLARE that it
> should be assumed to include current day humanity and all of our potential
> descendants (specifically *including* our Friendly AIs and any/all other
> "mind children" and even hybrids). If we discover alien intelligences, it
> should apply to them as well.
Actually, I like this. I presume that showing empathy to any intelligent, goal-driven agent means acting in a way that helps the agent achieve its goals, whatever they are. This aligns nicely with some common views of ethics, e.g.:

- A starving dog is intelligent and has the goal of eating, so the friendly action is to feed it.
- Giving a dog a flea bath is friendly because dogs are more intelligent than fleas.
- Killing a dog to save a human life is friendly because a human is more intelligent than a dog.
- Killing a human to save two humans is friendly because two humans are more intelligent than one.

My concern is what happens if a UFAI attacks an FAI. The UFAI has the goal of killing the FAI. Should the FAI show empathy by helping the UFAI achieve its goal?

I suppose the question could be answered by deciding which AI is more intelligent. But how is this done? A less intelligent agent will not recognize the superior intelligence of the other; a dog, for example, will not recognize the superior intelligence of humans. Also, we have IQ tests that identify prodigies among children, but no comparable test for adults. The question seems fundamental because a Turing machine cannot distinguish a process of higher algorithmic complexity than itself from a random process.

Or should we not worry about the problem, on the grounds that the more intelligent agent is more likely to win the fight? My concern is that evolution could favor unfriendly behavior, just as it has with humans.

-- Matt Mahoney, [EMAIL PROTECTED]

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
