On 9/11/2016 4:07 AM, Telmo Menezes wrote:
> Hi Brent,
>
> On Sat, Sep 10, 2016 at 8:29 PM, Brent Meeker <[email protected]> wrote:
>> Good paper.
>
> Thanks!
>> Many of the thoughts I've had about the subject too. But I
>> think your use of persistence is misleading. There are different ways to
>> persist. Bacteria persist, mountains persist - but very differently.
> Ok, I talk about persistence in the very specific sense of Dawkins's
> selfish gene: forward propagation of information in a system of
> self-replicators.
>> The AI that people worry about is one that modifies its utility
>> function to be like humans', i.e. to compete for the same resources
>> and persist by replicating and by annihilating competitors.
> That is one type of worry. The other (e.g. the "paper clip" scenario)
> does not require replication. It is purely the worry that side-effects
> of maximizing the utility function will have catastrophic
> consequences, while the AI is just doing exactly what we ask of it.
>> You may say that replicating isn't necessarily a good way to persist
>> and a really intelligent being would realize this; but I'd argue it
>> doesn't matter: some AI can adopt that utility function, just as
>> bacteria do, and be a threat to humans, just as bacteria are.
> I don't say that replication is the only way to persist. What I say is
> that evolutionary pressure is the only way to care about persisting.
I see caring about persisting and evolutionary pressure as both
derivative from replication. I'm not sure an AI will care about
replication or persistence, or that it can modify its own utility
function. I think JKC makes a good point that AIs cannot foresee their
own actions and so cannot predict the consequences of modifying their
own utility function - which means they can't apply a utility value to
it. Since we're supposing they are smarter than humans (but not
super-Turing), they would realize this.

On the other hand, humans do have their utility functions change, at
least on a superficial level: drugs, religion, age, love... all seem to
produce changes in people. I think AIs will be the same. Even if they
can't or won't change their utility functions as some kind of strategy,
they may be changed by accident and circumstance, or even by a random
cosmic ray. If such a change puts replication on the list of valuable
things to do, we'll be off to the Darwinian races.

AIs valuing replication will want to persist up to the point of
replicating - not necessarily beyond. Evolutionary pressure is just
shorthand for replicators competing for the finite resources needed to
replicate. So my point is that replication is basic: not persistence and
not utility functions.
Brent