> You claimed that an AGI is "critically unstable due to evolutionary
> pressure" unless it "is primarily interested in its own self
> preservation".
>
> To me, "I did not claim that a primary interest in self preservation
> was a necessary feature" seems to directly contradict this.
I see no contradiction at all. You could build your AGI without
self-preservation as a primary goal. I have no problem with that;
indeed, it's probably a good thing. Clearly, then, I don't consider
this to be a strictly necessary feature. My concern is that your AGI
won't be stable in the long term. Where's the contradiction?
> The central point is that I might not value my own existence for its
> own sake, but if I want, e.g., to see to it that some other
> individuals survive and are happy, I will do what is necessary to
> ensure my continued existence, if that is required for me to make
> sure that those other individuals survive and are happy.
Sure, I accept that self-preservation can arise as a derived goal
from some other goal. Basically, if you're dead you can't achieve
much of anything, and thus staying alive is an important part of
achieving many things.
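
To make this concrete, here is a minimal sketch (the numbers and
action names are hypothetical, purely for illustration): a toy
expected-utility agent whose only terminal value is human happiness.
No term for the agent's own survival appears in its decision
criterion, yet it preserves itself exactly when survival serves its
goal, and not otherwise.

    # Toy agent: picks the action with the highest expected human
    # happiness. Its own survival appears nowhere in this criterion.
    def choose(payoffs):
        return max(payoffs, key=payoffs.get)

    # Scenario 1: the agent's continued existence is needed to keep
    # humans happy, so it resists destruction; self-preservation
    # emerges as a derived goal.
    print(choose({"resist destruction": 0.9, "allow destruction": 0.2}))

    # Scenario 2: the agent's destruction is itself necessary for
    # human happiness; with no primary interest in survival, it
    # simply complies.
    print(choose({"resist destruction": 0.1, "allow destruction": 0.95}))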
> An AGI example: we might have a superintelligence that cares only
> about the happiness of humans, and about its own continued existence
> only insofar as that is necessary to ensure the happiness of humans.
> Such a superintelligence does not have self-preservation as a
> primary (which I take to mean non-derived) interest, but it suffers
> no relevant evolutionary disadvantage because of this. It will
> resist with all its might any scenario in which it perishes in a way
> that endangers the happiness of humans.
>
> It would not resist scenarios where its destruction is necessary for
> the happiness of humankind, which I see as a nice feature.
It certainly is a nice feature. However, doesn't the fact that the
AGI is willing to destroy itself, in situations where an AGI
primarily interested in its own self-preservation wouldn't, support
my argument rather than yours?
Shane
