Lee Corbin wrote:

> Jesse writes

> > Lee Corbin wrote:
> > > If I, on the other hand, knew that this wonderful room was going to
> > > be available to me on a specific date,... I would enthusiastically
> > > pay a good fraction of my net worth for this opportunity.
> > >
> > > Why?  Why would I do it?  Because logic grabs me by the throat and
> > > *forces* me to   :-)
> > What is the logic here exactly, though? From a third person point of view,
> > why is it objectively "better" to have a lot of copies of you having
> > identical good experiences than it is to have only a single copy have the
> > same good experience?

> First, I think that it's important to remove the qualifier "identical"
> here. Would two copies cease to be identical if one atom were out of
> place?

I meant something more like "running the same program"--I was thinking in terms of minds running on computers, since that's the only way to ensure all the copies run in lockstep. If you're talking about ordinary physical copies, the butterfly effect will probably cause their behavior and experiences to diverge in a noticeable macroscopic way fairly quickly.

> Hardly.  On another tack, you are the same person, etc., that
> you were five minutes ago where strict identicalness isn't even close.

From a third-person POV, why am I the same person? If you don't believe
there's an objective truth about continuity of identity, isn't it just a sort of aesthetic call?

> Second, suppose that someone loves you, and wants the best for you.
> There are a number of ways to describe this, but the infinite level
> one universe is a good one. The person who loves you (and so has such
> a true 3rd person point of view) sees you die here on Earth, and is
> sad for you. Yet she understands totally that you are still alive in
> many, many places 10^10^29 from here. Her most logical retort is that
> you should be alive *here* too; that an extra Jesse here is simply
> good for Jesse, no matter what is going on far away.
>
> If she finds out that although dead on Earth, you've been copied into
> a body out near Pluto, (and have the same quality of life there), she's
> once again happy for you.

That's a pretty unhuman kind of "love", though--if a person I know dies, I'm sad because I'll never get to interact with them again; the fact that versions of them may continue to exist in totally unreachable parallel universes isn't much comfort. Obviously my sadness is not because the death of the copy here means that there are only 10^10^29 - 1 copies of that person rather than 10^10^29 copies.

By the same token, the relief I'd feel knowing that there's a backup copy who's living on Pluto has to do with the fact that the potential for meeting and interacting now exists again, that all the information in my friend's brain hasn't been lost to me forever. But if I find out that there are *two* backup copies running in lockstep on Pluto, that doesn't make me any happier than I was before; in fact, I wouldn't feel a twinge of sadness if one of the copies were deleted to save room on the Plutonian hard drive.

> > After all, if they lied to you and never made any copies at all,
> > no version of you would ever know the difference.

> Well, lots of things can go better or worse for me without me
> being informed of the difference. Someone might perpetrate a
> scam on me, for example, that cheated me of some money I'd
> otherwise get, and it is still bad for me even if I don't know
> about it.

OK, in that case there are distinct potential experiences you might have had that you now won't get to have. But in the case of a large number of copies running in lockstep, there are no distinct experiences the copies will have that a single copy wouldn't have.

> > Also, wouldn't the same logic tell you that if we lived in a utopian society
> > where pretty much everyone was happy, it would be immoral to use birth
> > control because we want to make the number of people having happy
> > experiences as high as possible?

> Yes, exactly.  Each time we can rescue someone from non-existence,
> we should (given that other things are equal).

But if there's already at least one instantiation of a particular program, should we count additional instantiations as distinct "someones"? Aren't they just multiple copies of the same "someone", so as long as there's at least one copy, that "someone" exists?

> > That doesn't seem like a position that anyone who rejects first-person
> > thinking would automatically accept.

> They may not. But the two great moral revolutions/revelations of my
> life--- (i) cryonics  (ii) the Hedonistic Imperative (www.hedweb.com)
> --- lay down that life is better than death, and pleasure is better
> than pain.

But neither of these necessarily justifies the idea that redundant copies of the same program are better than a single copy of it.

The real problem here is that when we talk about what's "better" from a first-person POV, we just mean what *I* would prefer to experience happening to me. But if you want to think only in terms of a universal objective third-person POV, then you must define "better" in terms of some universal objective moral system, and there doesn't seem to be any "objective" way to decide questions like whether multiple copies of the same happy A.I. are better than a single copy.

