> In stating that "evil" is the natural result of a strong sense of self, I
> was hoping to avoid detailed discussion about good and evil, and instead
> propose a possible direction by which a solution can be found.  Namely, do
> not instill a strong sense of self into the AGI...

This is a very interesting thing to think about, Kevin.

I have two opposing thoughts on this:

1)
Since we humans will be teaching the AGI, and it will be learning by
interacting with humans and reading human literature, it will absorb
something of the human sense of self.

2)
Much of our sense of self derives from our physical embodiment.  If an AGI
has sense organs distributed throughout the world (millions of webcams,
weather satellite feeds, medical devices, etc.) it is not going to have the
same sense of embodiment as we do.  It may intrinsically come to feel itself
as distributed throughout the world... hence it may intrinsically NOT
develop the same kind of "self-sense" that we humans naturally seem to.

But I still suspect it will develop *some* sense of self vs. other.  I still
think this is a necessary aspect of intelligence.  A mind needs to
distinguish "that which I can fairly directly control" from "that which I
can control only indirectly."  Not making this distinction leads to
uselessness through stupidity; making this distinction leads to a
self-sense.

To paraphrase an old parable about men and mountains, I'd say


* An unintelligent system does not distinguish self from other
* An intelligent system distinguishes self from other
* A wise and intelligent system realizes that self and other are distinct,
but also the same


-- Ben Goertzel
