Hal Finney writes:

Consider an experiment where we are simulating someone and can give
them either a good or bad experience.  These are not replays, they are
new experiences which we can accurately anticipate will be pleasant
or unpleasant.

Suppose we are going to flip a biased quantum coin, one which has a 90%
chance of coming up heads.  We will generate the good or bad experience
depending on the outcome of the coin flip.  I claim that it is obvious
that it is better to give the good experience when we get the 90% outcome
and the bad experience when we get the 10% outcome.  That's the assumption
I will start with.
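That setup can be sketched as a toy simulation (the 90% bias and the good/bad labels are from the post; the sample size and random seed are arbitrary choices of mine):

```python
import random

random.seed(0)
N = 100_000  # number of simulated coin flips (arbitrary)

# Flip a coin biased 90% toward heads; heads -> good experience,
# tails -> bad experience, as in the post's assignment.
outcomes = ["good" if random.random() < 0.9 else "bad" for _ in range(N)]
good_fraction = outcomes.count("good") / N

print(round(good_fraction, 2))  # close to 0.9
```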

Now consider Tegmark's level 1 of parallelism, the fact that in a
sufficiently large volume of space I can find a large number of copies
of me, in fact copies of the entire earth and our entire visible universe
(the "Hubble bubble"?).  When I do my quantum coin flip, 90% of the copies
will see it come up heads and cause the good experience for the subject,
and 10% will see tails and cause the bad experience.

I will also assume that my knowledge of this fact about the physical
universe will not change my mind about the ethical value of my decision
to give the good experience for the 90% outcome.

Now the problem is this.  There are really only two different programs
being run for our experimental subject, the guy in the simulation.  One is
a good experience and one is bad.  All my decision does is to change how
many copies of each of these two programs are run.  In making my decision
about which experiences to assign to the two coin flip outcomes, I have
chosen that the copies of the good experience will outnumber copies of
the bad experience by 9 to 1.

But if I don't believe that the number of copies being run makes a
difference, then I haven't accomplished what I desired.  The fact that
I am running more copies of the good program than the bad wouldn't make
any difference.  Therefore there is no actual ethical value in what I
have done; I might just as validly have reversed the outcome of my coin
flips, and it would have made no difference.

Here is another way of explaining this situation. When there are multiple parallel copies of you, you have no way of knowing which copy you are, although at any given moment you definitely are one of the copies, with no telepathic links to the others or anything like that. If a proportion of the copies are painlessly killed, you notice nothing, because your successor observer moment (OM) will be provided by one of the copies still going (after all, this is what happens in the case of teleportation). Similarly, if the number of copies increases, you notice nothing, because at any given moment you are definitely only one of the copies, even if you don't know which one.

However, if your quantum coin flip causes 90% of the copies to have bad experiences, you *will* notice something: since it is impossible to know which particular copy you are at any moment, or which you will be the next moment, there is a 90% chance that you will be one of those who has the bad experience. Similarly, if you multiply the number of copies tenfold and give all the "new" copies bad experiences, then even though the "old" copies are left alone, you will still have a 90% chance of a bad experience, because it is impossible to know which copy will provide your next OM.
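Under the post's assumption that your next observer moment is equally likely to be supplied by any copy, the chance of a bad experience is simply the fraction of copies having one. A minimal sketch of the two scenarios (the copy counts are illustrative):

```python
def p_bad(bad_copies, total_copies):
    """Chance the next observer moment is a bad one, assuming it is
    equally likely to be provided by any copy (the post's assumption)."""
    return bad_copies / total_copies

n = 10  # original number of copies (arbitrary)

# Scenario A: 90% of the existing copies get the bad experience.
print(p_bad(9, 10))          # 0.9

# Scenario B: copies multiplied tenfold; all 9n "new" copies get the
# bad experience, the n "old" copies are left alone.
print(p_bad(9 * n, 10 * n))  # also 0.9
```

Both scenarios give the same 90% chance, which is the point of the paragraph above: only the relative proportion of copies matters on this view, not the absolute number.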

So, perhaps counterintuitively, you and all your copies are better off if all but one is painlessly killed than if the total number is increased and a proportion of the new copies is given a bad experience. This is what I was trying to show in my post "another puzzle". I think this way of looking at it is simple, consistent, does not require any new physical laws, and provides a reason to do good things rather than bad things in the multiverse, as long as you don't make the terrible mistake of assuming that the absolute measure of copies with good experiences is more important than the relative measure.

