RE: Torture yet again

2005-06-29 Thread Stathis Papaioannou

Lee Corbin writes:

[quoting Stathis Papaioannou]
 Certainly, this is the objective truth, and I'm very fond of the objective
 truth. But when we are talking about first person experience, we are not
 necessarily claiming that it provides us with objective knowledge of the
 world; we are only claiming that it provides us with objective knowledge of
 our first person experience.

Objective knowledge of my first person experience, eh?  I'll
have to ponder that one!  Perhaps it will help if I contrast
it with subjective knowledge of my first person experience  :-)


If I say, "I feel that man is a crook", that is a subjective statement about 
a 3rd person fact (the man's honesty), but an objective statement about a 
1st person fact (what I feel about the man).



 If we are to be strictly rational and consistent, it
 is simplest to go to the extreme of saying that *none*
 of the instantiations of an individual are actually the
 same person, which is another way of saying that each
 observer moment exists only transiently. This would mean
 that we only live for a moment, to be replaced by a copy
 who only thinks he has a past and a future.

Mike Perry, in his book Forever For All, develops these ideas
from the notion of day-persons, i.e., the idea that you
are not the same person from day to day. But that's
certainly not a satisfactory way of extending our usual
notions into these bizarre realms; you and I want to live
next week because we believe that we are the same persons
we'll be then.  And the idea that we *are* fuzzy sets in
person space permits this.

 We die all the time, so death is nothing to worry about.

On this definition, yes. But this is *such* an impractical
approach. We all know that it's bad for your neighbor when
he dies, despite us and him totally believing in the MWI.
We would like to avoid having to say that we die all the
time.


Impractical is not the first criticism that comes to mind re this belief. 
Suppose it were revealed to you that, as part of an alien experiment over the 
past 10,000 years, all Earth organisms with a central nervous system have been 
killed whenever they fall asleep and replaced with an exact copy. (Sleep has 
actually been introduced by the aliens specifically for this purpose; 
otherwise, what possible evolutionary advantage could it confer?) Would it 
make any practical difference to your life? Would your attitude towards 
friends and family change? Would you take stimulants and try to stay awake 
as long as possible? Is there anything about how you feel from day to day 
that could be taken as evidence for or against this revelation? If the 
aliens offered to stop doing this in your case in exchange for a substantial 
sum of money, or several years' reduction in your (apparent) lifespan, would 
you take up the offer? My answer to all these questions would be no.


--Stathis Papaioannou





Re: Torture yet again

2005-06-28 Thread Eugen Leitl
On Mon, Jun 27, 2005 at 10:42:17PM -0700, Lee Corbin wrote:

  No, it's not the same program.
 
 What do you mean?  I am postulating that it *is* the same sequence
 of code bytes, the *same* program. Do you know what I mean when
 I say that program A is the same program as program B?

An instantiated program is much more than a sequence of 
bytes -- it also has state. Most programs do not have much state, but some (AI,
specifically) are completely dominated by state. Another example is numerics,
say, CFD code (which is simple, in number of lines of code) computing a large 
system (which is not, because it contains TBytes of live data).

A program is a really bad metaphor for describing intelligent observers. It is
cleaner to describe the observer as a state, plus an engine iteratively
transforming that state. Whether the engine is mostly code, an ASIC, or a
block of molecular circuitry doesn't matter from that perspective. 

 It is this same, identical program that is running in two different
 places at the same time (pace relativity). Program A at location
 one is receiving input X and program A at position two is receiving
 input Y. I can't make it any clearer than that.

I understood you perfectly. No, it is not the same program. Two chess computers
playing two different games are two distinct individuals. Two chess computers
playing the same game (down to the clock cycle and single bit of state) are
the same program.

Assuming the devices don't store state, they boot up into a defined state,
and then diverge either from system randomness or user input (abstractly, of
course they will immediately diverge from clock skew and I/O with the real
world, but it's only an illustration).

Formally they're both flashed with HyperChess V3.0.4, and sloppily we can
refer to them as running the same program version. But these two systems are not
identical, unless synchronized.
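
To make the program/process distinction concrete, here is a minimal Python
sketch (purely illustrative; the engine, inputs and names are invented for
the example): the static code is identical, but the two instantiated
processes' state trajectories diverge as soon as their inputs differ.

    # Illustrative sketch only: "program" as static code versus "process" as a
    # state trajectory produced by iterating an engine over that state.
    def engine(state, user_input):
        # One iteration of the transformation: fold the input into the state.
        return state + [user_input]

    program_a = engine                    # both machines "flashed" with the
    program_b = engine                    # same program version
    print(program_a is program_b)         # True -- identical static code

    state_a, state_b = [], []             # both boot into the same defined state
    trajectory_a, trajectory_b = [], []
    inputs_a = [1, 2, 3]
    inputs_b = [1, 7, 3]                  # inputs diverge at the second step
    for x_a, x_b in zip(inputs_a, inputs_b):
        state_a = program_a(state_a, x_a)
        state_b = program_b(state_b, x_b)
        trajectory_a.append(list(state_a))
        trajectory_b.append(list(state_b))

    # Same program, but not the same process: the trajectories differ.
    print(trajectory_a == trajectory_b)   # False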

  You could say the space between your ears and mine enjoys the
  same physical laws, though. Both the arrangement of matter
  and the state of that matter (frozen-frame picture of spikes
  and gradients, gene activity, etc.etc) are very different.
 
 Of course. That's because the Eugen program is quite different
 from the Lee program. Now, the Eugen 2004 (March 23, 12:00:00)
 program is also somewhat different from the Eugen 2002 program
 (March 23, 12:00:00), but they are *very* similar in many, 
 many ways. So many ways that we are justified in asserting
 that they are for all practical purposes the same person 
 (and the same basic program).

Biology doesn't make a clean distinction between software and hardware.
I agree there is similarity/homology between me-former and me-today,
but that similarity is difficult to measure at a low level. Synchronizing
spatially separate discrete systems and making measurements on bit vectors is
something relatively simple, at least as a gedankenexperiment.

 Lee
 
 P.S. I had great, great difficulty in understanding anything
 that you had to say. I was not able to make most of it out.
 Perhaps you could add some redundancy to your tight prose?

Sorry to be so dense, sometimes I have to post under time constraints, in a
distracting environment. Will try to mend in future.

-- 
Eugen* Leitl <http://leitl.org>
__
ICBM: 48.07100, 11.36820    http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE




Re: Torture yet again

2005-06-27 Thread Eugen Leitl
On Sun, Jun 26, 2005 at 10:53:31AM -0700, Lee Corbin wrote:

  You can be in two places at the same time, but you can't
  enjoy two different scenarios, or think individual thoughts.
 
 I disagree.  Again, you slide back and forth between instantiations
 and programs, which, as you know, are not the same thing. What you

No, a system consists of a state and an iterated transformation
upon that state: the physical system (the human) and the physical
laws acting upon it.

Likewise, the assembly of bits in a computer, and the iterated transformation the
computational engine applies to it. Whether that engine is software or
hardware is relevant only for implementation reasons.

For complex organisms, state dominates over the engine both in the number of
bits and in the complexity of its evolution.

 have written is true of an instance. Were we to be completely

An instance is a process, the execution of a static image. Processes are the
same only when their trajectories (system evolution over state space) are identical.

 consistent using your terminology, then we would have to say
 that you could not think A and then think B, because each instance
 of you (in time, this time) cannot think more than one thing.

How do you measure whether two instances are the same? By comparing each
individual frame of the trajectory, bit by bit. If A is a sequence of frames
as is B, both belonging to the same system evolving in time, they will not be the
same, unless forced by external constraints. Panta rhei, I am no longer the
person I was yesteryear, etc. 

You have to look for more abstract homologies, extracting features from 
the trajectory, and comparing them.

Two synchronized systems produce the same trajectory, by definition.

 A program can run in two different places at the same time, and
 the program (treated as the pattern) is perfectly capable of
 receiving input X in one location at the same time that it 

No, a program is the wrong model. You can have identical pieces of a bit
pattern (CD-ROM, human zygote), but they diverge when instantiated on 
different machines, given different input. Even given very homogeneous
instances (say, one C. elegans and another with very similar neuroanatomy,
since genetically determined), they're processing different information, and
representing different environments (e.g. sensing a chemical gradient).

 receives input Y in another. It would then be correct to say
 that the program was enjoying two different scenarios at the
 same time.

No, it's not the same program. You could say the space between your ears and
mine enjoys the same physical laws, though. Both the arrangement of matter
and the state of that matter (frozen-frame picture of spikes and gradients,
gene activity, etc., etc.) are very different.

-- 
Eugen* Leitl <http://leitl.org>
__
ICBM: 48.07100, 11.36820    http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE




Re: More about identity (was Re: Torture yet again)

2005-06-27 Thread Stathis Papaioannou

Eric Cavalcanti writes:


But even in a MWI perspective, they are surely very different
processes, as someone else argued. Tossing a coin does not increase
the number of copies of yourself in the multiverse. Pushing the button
does. There is a symmetry between the two versions of yourself in the
coin tossing scenario. Clearly there is no reason for one to be preferred
to the other, so it is reasonable to believe there is a 50% probability
for you to experience each. But there is an asymmetry between you
and your copies, and there is some reason to believe that your
consciousness cannot experience being the copies, namely, the
argument that you should not experience anything strange if someone
scans your body without your knowledge.


The argument that the two copies are symmetrical in the MWI and there is no 
reason to choose one over the other is exactly my point about duplication in 
one world. For technical reasons, it generally *would* be possible to 
distinguish a copy from the original if we were using, for example, 
non-destructive teleportation, but surely this is just a detail. You have 
acknowledged that all the atoms in your body change over time, replaced by 
atoms from the environment, and you are still the original you. This is a 
gradual process, although the turnover in the brain is surprisingly fast 
(thanks to Jesse Mazer for that reference). Presumably, you would not be 
worried if the replacement happened all at once rather than gradually. 
Suppose you are standing still, and to your right is a supply of all the 
elements that are found in a human body. When you press a button, each atom 
in your body will move one metre to the left, while at the same time, an 
appropriate atom from the supply to your right will move into the position 
vacated by the atom that has just moved. The result is that there are now 
two copies of you, one metre apart. One copy is in the position you were in 
originally, but is comprised of different atoms. That shouldn't worry you, 
because this happens all the time anyway, so this copy could claim to be the 
original. On the other hand, the copy one metre to the left of where you 
were originally is no different to what would have occurred if you had just 
stepped one metre to the left, so this copy could claim to be the original. 
It seems both have a very good claim to being the original!



p.s.: By the way, this remark in the last message also applies to choice C:
About choice B (and C), it raises other interesting questions: suppose you
know that the copies are going to undergo some sort of plastic surgery a week
or so after the experiment, and will look very different from yourself now. They
could also undergo some type of slow personality modification (as education),
such that they would at any moment agree that they are experiencing a
continuity of identity. Would you still choose B? What if this change really isn't
slow, but sudden, at the time of creation of the copy? Does it make a
difference? Then what is the difference between doing a copy of yourself or
a copy of someone else, since any two people could be connected by a series
of continuous transformations? Would you still be comforted by the fact that
someone, even if very different from you, would be created to replace you?


This question applies without copying as well: if you had plastic surgery, 
then gradual personality change, gradual or sudden memory loss, would you 
still be you or would you be someone else? It illustrates the fact that 
there is no obvious or correct answer when it comes to questions of 
continuity of identity. In the final analysis, the answer has to be 
arbitrary.


--Stathis Papaioannou





RE: Torture yet again

2005-06-27 Thread Lee Corbin
Eugen writes

  A program can run in two different places at the same time, and
  the program (treated as the pattern) is perfectly capable of
  receiving input X in one location at the same time that it 
 
 No, program is the wrong model. You can have identical pieces of a bit
 pattern (CD-ROM, human zygote), but they diverge when instantiated on 
 different machines, given different input. Even given very homogeneous
 instances (say, one C. elegans and another with very similar neuroanatomy,
 since genetically determined) they're processing different information, and
 representing different environments (e.g. sensing a chemical gradient).
 
  receives input Y in another. It would then be correct to say
  that the program was enjoying two different scenarios at the
  same time.
 
 No, it's not the same program.

What do you mean?  I am postulating that it *is* the same sequence
of code bytes, the *same* program. Do you know what I mean when
I say that program A is the same program as program B?

It is this same, identical program that is running in two different
places at the same time (pace relativity). Program A at location
one is receiving input X and program A at position two is receiving
input Y. I can't make it any clearer than that.

 You could say the space between your ears and mine enjoys the
 same physical laws, though. Both the arrangement of matter
 and the state of that matter (frozen-frame picture of spikes
 and gradients, gene activity, etc.etc) are very different.

Of course. That's because the Eugen program is quite different
from the Lee program. Now, the Eugen 2004 (March 23, 12:00:00)
program is also somewhat different from the Eugen 2002 program
(March 23, 12:00:00), but they are *very* similar in many, 
many ways. So many ways that we are justified in asserting
that they are for all practical purposes the same person 
(and the same basic program).

Lee

P.S. I had great, great difficulty in understanding anything
that you had to say. I was not able to make most of it out.
Perhaps you could add some redundancy to your tight prose?



RE: Torture yet again

2005-06-26 Thread Lee Corbin
Bruno wrote

 On 23 June 2005, at 05:38, Lee Corbin wrote:
 
  you *can* be in two places at the same time.
 
 From a third person pov: OK.
 From a first person pov: how?

Right.  From a first person... you cannot be.  This further
illustrates the limitations of the first person account, its
subjectivity, its errors, and its total poverty of thought.

The objective view, which brings us much more into alignment
with what is actually the case, is, as always, the third-person
point of view.

A good historical analogy is this: to really understand the
planets, moons, and sun, it was necessary to totally abandon
the Earth-centric view, and try to see the situation from the
bird's eye view. By remaining fixated with appearances, and
how it looks *from here*, we could never have advanced to the
truth.

It is the same here; if you are interested in knowing what the
case is, and not merely what the appearances are, then you
have to understand that you are a physical process, and it
may so happen that you execute in different places, and in
different times, and that overlaps are possible.

Eugen comments

 You can be in two places at the same time, but you can't
 enjoy two different scenarios, or think individual thoughts.

I disagree.  Again, you slide back and forth between instantiations
and programs, which, as you know, are not the same thing. What you
have written is true of an instance. Were we to be completely
consistent using your terminology, then we would have to say
that you could not think A and then think B, because each instance
of you (in time, this time) cannot think more than one thing.

A program can run in two different places at the same time, and
the program (treated as the pattern) is perfectly capable of
receiving input X in one location at the same time that it 
receives input Y in another. It would then be correct to say
that the program was enjoying two different scenarios at the
same time.

Lee



RE: Torture yet again

2005-06-26 Thread Stathis Papaioannou

Lee Corbin writes:


The objective view, which brings us much more into alignment
with what is actually the case, is, as always, the third-person
point of view.

A good historical analogy is this: to really understand the
planets, moons, and sun, it was necessary to totally abandon
the Earth-centric view, and try to see the situation from the
bird's eye view. By remaining fixated with appearances, and
how it looks *from here*, we could never have advanced to the
truth.

It is the same here; if you are interested in knowing what the
case is, and not merely what the appearances are, then you
have to understand that you are a physical process, and it
may so happen that you execute in different places, and in
different times, and that overlaps are possible.


Certainly, this is the objective truth, and I'm very fond of the objective 
truth. But when we are talking about first person experience, we are not 
necessarily claiming that it provides us with objective knowledge of the 
world; we are only claiming that it provides us with objective knowledge of 
our first person experience. If I say that I have a headache, and my 
duplicate says he doesn't have a headache, who can argue with that? This is 
the basis for maintaining that we are two separate people. You say 
later in your post that if I am to be consistent, I would have to say that 
we are two different people when we are separated by time as well as space 
or across parallel universes. What I would say is that my successor tomorrow 
is potentially me if there is continuity of consciousness between all the 
intermediates between now and then. The successor of my duplicate with the 
headache does not satisfy this criterion and is therefore not potentially 
me. Arbitrary though this criterion for continuity of identity may be, it 
is the criterion our minds have evolved with, and calling it irrational will 
not change that fact. If we are to be strictly rational and consistent, it 
is simplest to go to the extreme of saying that *none* of the instantiations 
of an individual are actually the same person, which is another way of 
saying that each observer moment exists only transiently. This would mean 
that we only live for a moment, to be replaced by a copy who only thinks he 
has a past and a future. We die all the time, so death is nothing to worry 
about. I actually believe this extreme view to be closest to the objective 
truth, but I still make plans for the future and I still don't want to 
die in the more usual sense of the word. Being rational is completely 
incapable of making any impact on my biological programming in this case, 
and as you know, there are people in the world who hold being rational in 
much lower esteem than the members of this list do.


--Stathis Papaioannou





RE: Torture yet again

2005-06-26 Thread Lee Corbin
Stathis writes

  same here; if you are interested in knowing what the
  case is, and not merely what the appearances are, then you
  have to understand that you are a physical process, and it
  may so happen that you execute in different places, and in
  different times, and that overlaps are possible.
 
 Certainly, this is the objective truth, and I'm very fond of the objective 
 truth. But when we are talking about first person experience, we are not 
 necessarily claiming that it provides us with objective knowledge of the 
 world; we are only claiming that it provides us with objective knowledge of 
 our first person experience.

Objective knowledge of my first person experience, eh?  I'll
have to ponder that one!  Perhaps it will help if I contrast
it with subjective knowledge of my first person experience  :-)

 I [may] have to say that we are two different people when we
 are separated by time as well as space or across parallel
 universes. What I would say is that my successor tomorrow 
 is potentially me if there is continuity of consciousness
 between all the intermediates between now and then.

I'm skeptical of continuity requirements. Now I do not believe in
Greg Egan's equations in Permutation City: according to a premise
of the story, in order to obtain the you of tomorrow, there is a
short-cut alternative to just letting you run.  And that is to
determine the solutions of an immense number of differential
equations that do not in fact emulate your intermediary states.
If this were so, then it may be that you could discontinuously
skip past all of tonight and tomorrow's experiences, and just
start living by directly experiencing the day after that.

It's easy to imagine this being possible; when I was a teen and
was faced with the loathsome task of mowing the lawn, I wondered
if it could be possible for me to just not have that experience
at all, but for my life to just magically resume after the chore
was completed (somehow).  I was aware that what I wanted was not
simply memory erasure.

 The successor of my duplicate with the headache does not satisfy
 this criterion and is therefore not potentially me.

Well, are you sure?  What if he takes a memory-erasure pill
(one that works far more completely than Midazolam) and thereby
becomes a past state that is identical to one of your
past states, and then evolves forward into states that you
definitely consider to be your natural successors?

After people are uploadable, this could happen without much
fuss all the time. The interplay of, and the combinatorial
possibilities of, *memory addition*, *experience*, and *memory
erasure* lead back to the notion that one is just a fuzzy set
in the collection of all persons or person-states.

 Arbitrary though this criterion for continuity of identity
 may be, it is the criterion our minds have evolved with,
 and calling it irrational will not change that fact.

Well, some of this is involuntary, but some of it is not.
I've never seen how to shake *anticipation*, for example,
and suppose that we're just stuck with it, problems and
all. But actually I don't have any problem believing that
I *am* my duplicates, even those across the room, who are
just me seeing a different perspective of the room (and
perhaps having slightly different thoughts).

 If we are to be strictly rational and consistent, it 
 is simplest to go to the extreme of saying that *none*
 of the instantiations of an individual are actually the
 same person, which is another way of saying that each
 observer moment exists only transiently. This would mean 
 that we only live for a moment, to be replaced by a copy
 who only thinks he has a past and a future.

Mike Perry, in his book Forever For All, develops these ideas
from the notion of day-persons, i.e., the idea that you
are not the same person from day to day. But that's 
certainly not a satisfactory way of extending our usual
notions into these bizarre realms; you and I want to live
next week because we believe that we are the same persons
we'll be then.  And the idea that we *are* fuzzy sets in
person space permits this.

 We die all the time, so death is nothing to worry about.

On this definition, yes. But this is *such* an impractical
approach. We all know that it's bad for your neighbor when
he dies, despite us and him totally believing in the MWI.
We would like to avoid having to say that we die all the
time.

Lee

 I actually believe this extreme view to be closest to the objective 
 truth, but I still make plans for the future and I still don't want to 
 die in the more usual sense of the word. Being rational is completely 
 incapable of making any impact on my biological programming in this case, 
 and as you know, there are people in the world who hold being rational in 
 much lower esteem than the members of this list do.
 
 --Stathis Papaioannou



More about identity (was Re: Torture yet again)

2005-06-24 Thread Eric Cavalcanti
I can see an interesting new problem in this thread. Let me put it in a thought
experiment, as the praxis of this list requires.

You are in the same torture room as before, but now the guy is going to 
torture you to death. You have three options:

A: you flip a coin to decide whether you are going to be tortured;
B: you press the copy button 100 times;
C: you press the copy button once.

What do the people in this list choose?

For some people, creating copies increases their 1st person probability
of escaping torture. So each time they press the button they can
associate with it a 50% probability of escape. These would
choose B, since then they would have a very near certainty of escaping
torture.
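
On that reading (each press treated as an independent 50% chance of escape;
an assumption used only for this illustration, not a settled fact), the
"very near certainty" is easy to put a number on:

    # Sketch of the arithmetic behind "very near certainty", assuming each of
    # the 100 presses behaves like an independent 50% chance of escaping.
    n_presses = 100
    p_never_escape = 0.5 ** n_presses           # ~7.9e-31
    print(p_never_escape, 1 - p_never_escape)   # second value prints as 1.0 in floats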

For others,  creating copies does not increase any such probability, and
there is ultimately no meaning in talking about 1st person probability.
But for some reason they seem to feel a strong connection with the
copies, as if they are all the same person. They think it is just as
good to offer a good meal to the copies as it is to offer it to 
themselves. These people should choose C, since in this case they will
be comforted by the fact that a copy of themselves would survive and have
a good life. They don't really need more than one. Actually, one is much
better than many, since they wouldn't have the legal and financial problems
associated with having lots of copies around.

For others, such as myself, creating copies does not increase my 1st person
probability of escaping torture. And unlike Lee, I think it is just as
good to offer a good meal to my copy as it is to offer it to my family and
friends. But it is definitely different from offering it to me.
These people would choose A.

I cannot really understand choice B. Would anyone really choose that or am I
just grossly misunderstanding some opinions in this list?

About choice B, it raises other interesting questions: suppose you know that
the copies are going to undergo some sort of plastic surgery a week or so
after the experiment, and will look very different from yourself now. They could
also undergo some type of slow personality modification (as education), such
that they would at any moment agree that they are experiencing a continuity
of identity. Would you still choose B? What if this change really isn't slow,
but sudden, at the time of creation of the copy? Does it make a difference?
Then what is the difference between doing a copy of yourself or a copy of
someone else, since any two people could be connected by a series of
continuous transformations? Would you still be comforted by the fact that
someone, even if very different from you, would be created to replace you?

Eric



RE: More about identity (was Re: Torture yet again)

2005-06-24 Thread Stathis Papaioannou


Eric Cavalcanti writes:


You are in the same torture room as before, but now the guy is going to
torture you to death. You have three options:

A: you flip a coin to decide whether you are going to be tortured;
B: you press the copy button 100 times;
C: you press the copy button once.

What do the people in this list choose?


I would choose B. B is effectively the same as flipping a coin 100 times, 
with a 50% chance of escaping the torture every time. If you say that every 
time the button is pressed not only are you copied, but the entire universe 
is copied sans torture, then this is almost exactly the same as flipping a 
coin 100 times if MWI is true. Why do you think it's OK to be duplicated as 
a result of the universe splitting and not OK to be duplicated by pressing 
a button?


--Stathis Papaioannou





Re: Torture yet again

2005-06-24 Thread Eugen Leitl
On Fri, Jun 24, 2005 at 05:08:39PM +0200, Bruno Marchal wrote:
 
 On 23 June 2005, at 05:38, Lee Corbin wrote:
 
 you *can* be
 in two places at the same time.
 
 From a third person pov: OK.
 From a first person pov: how?

You can be in two places at the same time, but you can't enjoy two different
scenarios, or think individual thoughts.

It's a degenerate case, and rather uninteresting (but relevant for High
Availability / Failover clusters -- HA, heartbeat, drbd, stonith).

-- 
Eugen* Leitl <http://leitl.org>
__
ICBM: 48.07100, 11.36820    http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE




Re: Torture yet again

2005-06-24 Thread Bruno Marchal


On 24 June 2005, at 17:23, Eugen Leitl wrote:


On Fri, Jun 24, 2005 at 05:08:39PM +0200, Bruno Marchal wrote:


On 23 June 2005, at 05:38, Lee Corbin wrote:


you *can* be
in two places at the same time.


From a third person pov: OK.
From a first person pov: how?


You can be in two places at the same time, but you can't enjoy two different
scenarios, or think individual thoughts.



Given the definition of 1-person (the one who enjoys) and the third 
person (the one you can capture with a picture, description, name, 
identity card, etc.), you are just saying what I said, or even better:


From a third person pov: yes.
From a first person pov: no.

(I assume the places Lee talks about *are* 1-person distinguishable; 
of course I am at every indistinguishable place at once, drinking an 
infinity of coffee cups in all Brussels from all Belgiums from all 
Europas (well, here I am less sure!) from all earths from all milky ways 
from all universes from all multiverses ... from all sets of 
recoverable conceivable computational histories ... in the only one 
arithmetical truth).   [comp assumed!]


Bruno

http://iridia.ulb.ac.be/~marchal/




RE: Torture yet again

2005-06-23 Thread Stathis Papaioannou

Lee Corbin writes:

quote--
[quoting Stathis]

 When you press the button in the torture room, there is a 50%
 chance that your next moment will be in the same room and
 a 50% chance that it will be somewhere else where you won't be
 tortured. However, this constraint has been added to the
 experiment: suppose you end up the copy still in the torture
 room whenever you press the button. After all, it is certain
 that there will be a copy still in the room, however many
 times the button is pressed. Should this unfortunate person
 choose the coin toss instead?


To me, it's always been a big mistake to employ the language of
probability; you *will* be in the room where the torture is and
you *will* be in the room where it's not, because you *can* be
in two places at the same time.

[quoting Jonathan]

 If he shares your beliefs about identity, then if he changes his mind he
 will be committing the gambler's fallacy.

 However, after having pressed the button 100 times and with nothing to show
 for it except 100 tortures, his faith that he is a random observer might be
 shaken :).


You may want to read a story, "The Pit and the Duplicate", that I wrote many
years ago, which dwells on the ironies of being duplicates. It's a little
like Stathis's point here. http://www.leecorbin.com/PitAndDuplicate.html
--endquote

Lee's story linked to above is a good summary of the issues. I fundamentally 
disagree with Lee and Hal Finney about the status of copies, because I *do* 
consider that I will only be one person at a time, from a first person 
perspective. If I am going to be more than one person, it would involve a 
special process like telepathy or mind-melding, or something. My criterion 
is that if you stick a pin in someone and I feel it, then that person is me; 
if I don't feel it, then that person isn't me. There is a reasonable line of 
argument that says if copying were widespread, then this criterion would 
change, because people who considered their copies to be as good as self, 
and worked to increase their number and protect their interests, would 
eventually come to predominate. However, it would involve a profound and 
fundamental change in our psychology, so that we would become something like 
hive insects. Basically, when I look at these thought experiments, I assume 
that I am me as I am *now*, serving my own selfish interests as they seem to 
be to me now. If we specify what kind of self-interest is being served in 
discussing these examples, i.e. whether the traditional human type or that 
of some post-human ideal, we can avoid misunderstanding.


Having said that, there is a real paradox in Jonathan's and Lee's thought 
experiment, which does not occur in a single world/ probabilistic cosmology: 
the button-presser will always be the loser. From his point of view, he will 
never escape, but rather is helping others escape. It isn't "others" before 
he presses the button, but it certainly is "others" the moment after the 
button is pressed, from the point of view of the still-and-forever 
button-presser, since as soon as they are created, the duplicates start to 
diverge. Psychologically, it is easy to see how the button-presser could 
decide, against his better judgement, that it is hopeless to keep pressing, 
and this could happen even if the chance of escape per press is raised 
arbitrarily close to certainty. The paradox resolves if, at random, all but 
one of the copies is instantly destroyed the moment they are created, 
because then the button-presser can be assured that if he presses enough 
times, the chance of escape will come arbitrarily close to certainty. I 
suppose this is another situation where *reduction* in total measure can 
actually be a positive - and this time without even any relative reduction, 
on average, of adverse outcomes.


--Stathis Papaioannou





RE: Torture yet again

2005-06-22 Thread Stathis Papaioannou

Jonathan Colvin writes:


You are sitting in a room, with a not very nice man.

He gives you two options.

1) He'll toss a coin. Heads he tortures you, tails he doesn't.

2) He's going to start torturing you a minute from now. In the meantime, he
shows you a button. If you press it, you will get scanned, and a copy of you
will be created in a distant town. You've got a minute to press that button
as often as you can, and then you are getting tortured.

What are you going to choose (Stathis and Bruno)? Are you *really* going to
choose (2), and start pressing that button frantically? Do you really think
it will make any difference?

I'm just imagining having pressed that button a hundred times. Each time I
press it, nothing seems to happen. Meanwhile, the torturer is making his
knife nice and dull, and his smile grows ever wider.

Cr^%^p, I'm definitely choosing (1).

Ok, sure, each time I press it, I also step out of a booth in Moscow,
relieved to be pain-free (shortly to be followed by a second me, then a
third, each one successively more relieved.) But I'm still choosing (1).

Now, the funny thing is, if you replace torture by getting shot in the
head, then I will pick (2). That's interesting, isn't it?


This is a good question. It reminds me of what patients sometimes say when 
their doctor confidently explains that the proposed treatment has only a one 
in a million risk of some terrible complication: yes, but what if I'm that 
one in a million? In a multiverse model of the universe, the patient *will* 
be that one in a million, in one millionth of the parallel worlds. This 
means you can arrange experiments so that the copies generated on the basis 
of an unlikely outcome are segregated, making it seem to this subset that 
the improbable is probable or, as in the above example, the contingent is 
certain.


When you press the button in the torture room, there is a 50% chance that 
your next moment will be in the same room and a 50% chance that it will 
be somewhere else where you won't be tortured. However, this constraint has 
been added to the experiment: suppose you end up the copy still in the 
torture room whenever you press the button. After all, it is certain that 
there will be a copy still in the room, however many times the button is 
pressed. Should this unfortunate person choose the coin toss instead?


Say you do choose the coin option, and let's allow that you can toss the 
coin as many times as you want in the minute you have before the torture 
starts. If the MWI is true, in half of the subsequent worlds the coin comes 
up heads and the version of you in these worlds can still expect torture; 
while in the other half, the coin comes up tails and the torturer lets you 
go. Now, let's add this constraint: suppose that you are the copy for whom 
the coin always comes up heads, however many times you toss it. After all, 
in the MWI it is certain that there will be such a copy, however many times 
the coin is tossed. Should this unfortunate person give up on the coin and 
try begging for mercy while he still has some time left?


Here's another version of the of problem, this time without torture. Suppose 
you have the opportunity to use a machine which, when you put $2 in a slot, 
will destructively analyse you and create 10 copies. Of these copies, 9 will 
each be given $1 million in cash, while the 10th copy will get nothing other 
than another opportunity to use a similar machine. Suppose you are the copy 
who keeps putting coins into the machines and not winning anything. How long 
will it be before you decide you are wasting your money?
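
For a rough sense of the numbers (this tabulation is my framing of the
scenario, not a formal model): a single-world gambler would assign the
persistently unlucky line of copies a probability of (1/10)^k after k uses,
yet with copying exactly one such copy is guaranteed to exist, $2 poorer per
round, while each press creates $9 million in winnings in aggregate.

    # Rough tabulation under the stated assumptions: the single-world probability
    # of being the always-unlucky copy after k uses, what that copy has spent,
    # and the aggregate winnings the presses have created so far.
    cost, prize, winners = 2, 1_000_000, 9
    for k in (1, 5, 10):
        p_unlucky = 0.1 ** k                    # yet one such copy always exists
        print(k, p_unlucky, "spent:", k * cost,
              "aggregate winnings:", k * winners * prize)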


What these examples all have in common is that the unlucky copies are 
singled out and, ironically, it is these copies who have control over the 
process (button, coin) which results in their bad luck. If the experiments 
were changed so that, in the copying process, only one randomly chosen copy 
were actually implemented, the apparent probabilities would remain the same 
but it would not be possible to separate out an unlucky group, and the best 
choice would not be problematic. This is how probabilities work in a single 
world model, and our minds have evolved to assume that we live in such a 
world.


--Stathis Papaioannou





RE: Torture yet again

2005-06-22 Thread Hal Finney
Jesse Mazer wrote:

Suppose there had already been a copy made, and the two of you 
were sitting side-by-side, with the torturer giving you the 
following options:

A. He will flip a coin, and one of you two will get tortured 
B. He points to you and says "I'm definitely going to torture 
the guy sitting there, but while I'm sharpening my knives he 
can press a button that makes additional copies of him as many 
times as he can."

Would this change your decision in any way? What if you are 
the copy in this scenario, with a clear memory of having been 
the original earlier but then pressing a button and finding 
yourself suddenly standing in the copying chamber--would that 
make you more likely to choose B?

I think this variation points to the major flaw in this thought
experiment, which is the implicit assumption that copying is possible yet
is not used.  In fact, if copying is possible as the thought experiment
stipulates, it would tend to be widely used.  The world would be full of
people who are copies.  You would be likely to be an nth-generation copy.
There would be none of the novelty that Jesse's variation suggests in allowing
you to experience (presumably for the first time!) being copied.

I keep harping on this because copying increases measure.  It is different
from flipping a coin, which does not increase measure.  Your expectations
going into a copy are different.  To the extent that this language makes
sense, I would say that you have a 100% chance of becoming the copy and
a 100% chance of remaining the original.  This is different from flipping
a coin.

You may think that it would feel the same way, but you've never tried it.
Fundamentally, our perception of the world, our phenomenology, our sense
of identity and our concept of future and past selves are not intrinsic,
but are useful tools which have *evolved* to allow our minds to achieve
the goals of survival and reproduction.  In a world where copying
is possible, we would evolve different ways of perceiving the world.
I believe that in such a world, we would perceive the aftermath of copying
very differently than the aftermath of flipping a coin.  The effects
are different, the evolutionary and survival implications are different.

In the world of this thought experiment, if the additional copies are
(via special dispensation) going to be treated well and given a good
chance to survive and thrive, then yes, most people would press the
button like crazy.  It's just like today, if a bachelor were given
the opportunity to have sex with a dozen beautiful women, he'd jump
at the chance.  It's not because of any intrinsic value in the act,
it's because evolution has programmed him to take this opportunity to
increase the measure of his genes.  In the same way, pressing the button
would increase the measure of your mind, and it would be equally as
rewarding.

In the spirit of this list, let me offer my own variation.  It is like
the original, except instead of torture you are offered a 50-50 chance
to experience a delicious meal prepared by an expert chef.  Or you can
press the button to make some copies, in which case you get a 100% chance
of having the meal.  For me, pressing the button is a win-win situation,
assuming the copies will be OK.  I certainly don't think that pressing
the button reduces the measure of my enjoyment of the food.

Hal Finney



RE: Torture yet again

2005-06-22 Thread Jonathan Colvin
Stathis wrote: 
When you press the button in the torture room, there is a 50% 
chance that your next moment will be in the same room and and 
a 50% chance that it will be somewhere else where you won't be 
tortured. However, this constraint has been added to the 
experiment: suppose you end up the copy still in the torture 
room whenever you press the button. After all, it is certain 
that there will be a copy still in the room, however many 
times the button is pressed. Should this unfortunate person 
choose the coin toss instead?

If he shares your beliefs about identity, then if he changes his mind he
will be committing the gambler's fallacy.

However, after having pressed the button 100 times and with nothing to show
for it except 100 tortures, his faith that he is a random observer might be
shaken :).

Jonathan Colvin



RE: Torture yet again

2005-06-22 Thread Stathis Papaioannou

Jonathan Colvin writes:


Stathis wrote:
When you press the button in the torture room, there is a 50%
chance that your next moment will be in the same room and
a 50% chance that it will be somewhere else where you won't be
tortured. However, this constraint has been added to the
experiment: suppose you end up the copy still in the torture
room whenever you press the button. After all, it is certain
that there will be a copy still in the room, however many
times the button is pressed. Should this unfortunate person
choose the coin toss instead?

If he shares your beliefs about identity, then if he changes his mind he
will be committing the gambler's fallacy.

However, after having pressed the button 100 times and with nothing to show
for it except 100 tortures, his faith that he is a random observer might be
shaken :).


Yes, but do you agree it is the same for any probabilistic experiment in a 
many worlds cosmology? If you sit down and toss a coin 100 times in a row, 
there will definitely be one version of you who has obtained 100 heads in a 
row, just as there will definitely be one version of you (the one still in 
the torture room) who has nothing to show after pushing the button 100 
times.
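
To make the "definitely one version" point concrete, here is a toy
enumeration (a smaller number of tosses, purely for tractability): among the
2^n equally weighted branches there is always exactly one all-heads branch,
however small its individual weight.

    # Toy illustration: enumerate every branch of n coin tosses and count the
    # all-heads branches -- there is always exactly one.
    from itertools import product
    n = 10
    branches = list(product("HT", repeat=n))            # 2**n branches
    all_heads = [b for b in branches if all(c == "H" for c in b)]
    print(len(branches), len(all_heads), 0.5 ** n)      # 1024 1 0.0009765625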


--Stathis Papaioannou





RE: Torture yet again

2005-06-22 Thread Lee Corbin
Hi everyone,

I've been in heated discussions about duplicates for 39 years now,
and so I just don't have much patience with it any more.

I have not read many of the recent posts, but I have always gone
along with the viewpoint that more runtime is good, and that
it linearly bestows benefit on one.

I do notice this email:

Jonathan Colvin writes:

Stathis wrote:
  When you press the button in the torture room, there is a 50%
  chance that your next moment will be in the same room and
  a 50% chance that it will be somewhere else where you won't be
  tortured. However, this constraint has been added to the
  experiment: suppose you end up the copy still in the torture
  room whenever you press the button. After all, it is certain
  that there will be a copy still in the room, however many
  times the button is pressed. Should this unfortunate person
  choose the coin toss instead?

To me, it's always been a big mistake to employ the language of
probability; you *will* be in the room where the torture is and
you *will* be in the room where it's not, because you *can* be
in two places at the same time.

  If he shares your beliefs about identity, then if he changes his mind he
  will be committing the gambler's fallacy.
 
  However, after having pressed the button 100 times and with nothing to show
  for it except 100 tortures, his faith that he is a random observer might be
  shaken :).

You may want to read a story, "The Pit and the Duplicate", that I wrote many
years ago, which dwells on the ironies of being duplicates. It's a little
like Stathis's point here. http://www.leecorbin.com/PitAndDuplicate.html

Lee



RE: Torture yet again

2005-06-22 Thread Jonathan Colvin
 
 Stathis wrote:
  When you press the button in the torture room, there is a 50% chance 
  that your next moment will be in the same room and a 50% chance 
  that it will be somewhere else where you won't be tortured. However, 
  this constraint has been added to the experiment: suppose you end up 
  the copy still in the torture room whenever you press the button. 
  After all, it is certain that there will be a copy still in the room, 
  however many times the button is pressed. Should this unfortunate 
  person choose the coin toss instead?
 
 If he shares your beliefs about identity, then if he changes his mind 
 he will be committing the gambler's fallacy.
 
 However, after having pressed the button 100 times and with nothing to 
 show for it except 100 tortures, his faith that he is a random observer 
 might be shaken :).
 
 Yes, but do you agree it is the same for any probabilistic experiment 
 in a many worlds cosmology? If you sit down and toss a coin 100 times 
 in a row, there will definitely be one version of you who has obtained 
 100 heads in a row, just as there will definitely be one version of 
 you (the one still in the torture room) who has nothing to show after 
 pushing the button 100 times.

Yes, I agree. There are always going to be an unfortunate few.

I think I know where this is going; if manyworlds is correct, there will be
10^100 copies of me created in the next instant to which nothing bad
happens, and a much smaller measure to whom something nasty happens, quite
by chance. Presumably if I choose 50% over 10 copies, I should also choose
50% over 10^100 copies, so if given the option between the status quo
(assuming manyworlds) and a seemingly much higher chance of something nasty
happening, I should choose the higher chance of nastiness (if I'm being
consistent). 

There's not much answer to that; probably if I were convinced that manyworlds
is correct, and something nasty *is* bound to happen to a small number of me
in the next instant, I *would* choose the copies. In our thought experiment
the subject knows he's getting tortured; unless we can prove manyworlds, the
nastiness is only conjecture.

If that wasn't where you were heading, forgive the presumption... :)

Jonathan Colvin



RE: Torture yet again

2005-06-22 Thread Jonathan Colvin
I (Jonathan Colvin) wrote:
   When you press the button in the torture room, there is a 50% chance
   that your next moment will be in the same room and a 50% chance
   that it will be somewhere else where you won't be tortured. However,
   this constraint has been added to the experiment: suppose you end up
   the copy still in the torture room whenever you press the button.
   After all, it is certain that there will be a copy still in the room,
   however many times the button is pressed. Should this unfortunate
   person choose the coin toss instead?
  
  If he shares your beliefs about identity, then if he changes his mind
  he will be committing the gambler's fallacy.
  
  However, after having pressed the button 100 times and with nothing to
  show for it except 100 tortures, his faith that he is a random observer
  might be shaken :).
  
  Yes, but do you agree it is the same for any probabilistic experiment
  in a many worlds cosmology? If you sit down and toss a coin 100 times
  in a row, there will definitely be one version of you who has obtained
  100 heads in a row, just as there will definitely be one version of
  you (the one still in the torture room) who has nothing to show after
  pushing the button 100 times.
 
 Yes, I agree. There are always going to be an unfortunate few.
 
 I think I know where this is going; if manyworlds is correct,
 there will be 10^100 copies of me created in the next
 instant to which nothing bad happens, and a much smaller
 measure to whom something nasty happens, quite by chance.
 Presumably if I choose 50% over 10 copies, I should also
 choose 50% over 10^100 copies, so if given the option
 between the status quo (assuming manyworlds) and a seemingly
 much higher chance of something nasty happening, I should
 choose the higher chance of nastiness (if I'm being consistent).
 
 There's not much answer to that; probably if I were convinced
 that manyworlds is correct, and something nasty *is* bound to
 happen to a small number of me in the next instant, I *would*
 choose the copies. In our thought experiment the subject
 knows he's getting tortured; unless we can prove manyworlds,
 the nastiness is only conjecture.
 
 If that wasn't where you were heading, forgive the presumption... :)

Ok, you've convinced me (or did I convince myself?). I've joined the ranks
of the button pushers (with large numbers of copies, anyway). But the
probabilities seem to make a difference. For instance, if there's a 50%
chance of torture vs. 3 copies with one getting tortured for sure, I'll
still choose the 50%. Don't ask me at which number of copies I'll start
pushing the button; I dunno.

Jonathan Colvin 



Re: Torture yet again

2005-06-21 Thread Eugen Leitl
On Tue, Jun 21, 2005 at 04:05:02AM -0700, Jonathan Colvin wrote:

 Now, the funny thing is, if you replace torture by getting shot in the
 head, then I will pick (2). That's interesting, isn't it?

Why is that interesting? It's indistinguishable from a teleportation
scenario.

-- 
Eugen* Leitl <http://leitl.org>
__
ICBM: 48.07100, 11.36820    http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE




Re: Torture yet again

2005-06-21 Thread Hal Finney
Jonathan Colvin writes:
 You are sitting in a room, with a not very nice man.

 He gives you two options.

 1) He'll toss a coin. Heads he tortures you, tails he doesn't.

 2) He's going to start torturing you a minute from now. In the meantime, he
 shows you a button. If you press it, you will get scanned, and a copy of you
 will be created in a distant town. You've got a minute to press that button
 as often as you can, and then you are getting tortured.

I understand that you are trying to challenge this notion of subjective
probability with copies.  I agree that it is problematic.  IMO it is
different to make a copy than to flip a coin -  different operationally,
and different philosophically.

What you need to do is to back down from subjective probabilities and
just ask it like this: which do you like better, a universe where there
is one of you who has a 50-50 chance of being tortured; or a universe
where there are a whole lot of you and one of them will be tortured?
Try not to think about which one you will be.  You will be all of them.
Think instead about the longer term: which universe will best serve your
needs and desires?

There is an inherent inconsistency in this kind of thought experiment
if it implicitly assumes that copying technology is cheap, easy and
widely available, and that copies have good lives.  If that were the
case, everyone would use it until there were so many copies that these
properties would no longer be true.

It is important in such experiments to set up the social background in
which the copies will exist.  What will their lives be like, good or
bad?  If copies have good lives, then copying is normally unavailable.
In that case, the chance to make copies in this experiment may be a
once-in-a-lifetime opportunity.  That might well make you willing to
accept torture of a person you view as a future self, in exchange for
the opportunity to so greatly increase your measure.

OTOH if copying is common and most people don't do it because the future
copies will be penniless and starve to death, then making copies in this
experiment is of little value and you would not accept the greater chance
of torture.

This analysis is all based on the assumption that copies increase measure,
and that in such a world, observers will be trained that increasing
measure is good, just as our genes quickly learned that lesson in a
world where they can be copied.

Hal Finney



Re: Torture yet again

2005-06-21 Thread Bruno Marchal


On 21 June 2005, at 13:05, Jonathan Colvin wrote:



Sorry, I can't let go of this one. I'm trying to understand it
psychologically.

Here's another thought experiment which is roughly equivalent to our
original scenario.

You are sitting in a room, with a not very nice man.

He gives you two options.

1) He'll toss a coin. Heads he tortures you, tails he doesn't.

2) He's going to start torturing you a minute from now. In the meantime, he
shows you a button. If you press it, you will get scanned, and a copy of you
will be created in a distant town. You've got a minute to press that button
as often as you can, and then you are getting tortured.

What are you going to choose (Stathis and Bruno)? Are you *really* going to
choose (2), and start pressing that button frantically? Do you really think
it will make any difference?



I will choose 2, and most probably start pressing the button 
frantically.  Let us imagine that I press the button 64 times.
The one who will be tortured is rather unlucky: he has a 1/2^64 chance of 
staying in front of you.  He will probably even infer the falsity of 
comp, but then you will kill him!
The 64 other Brunos will infer comp is true, and send 64 more 
arguments for it to the list, including the argument based on having 
survived your experiment!
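
The 1/2^64 figure comes from reading each press as an independent 50% chance
of next finding yourself elsewhere; on a uniform reading over the 65
resulting copies it would instead be 1/65. A quick comparison of the two
readings, both of which are discussed elsewhere in this thread (the snippet
itself is only illustrative):

    # Two readings of "the chance of being the one still in the room" after
    # pressing the button 64 times.
    n = 64
    p_per_press = 0.5 ** n       # ~5.4e-20: each press as an independent 50% chance
    p_uniform = 1 / (n + 1)      # ~0.0154: uniform over the 65 resulting copies
    print(p_per_press, p_uniform)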


OK with the number?

Bruno



http://iridia.ulb.ac.be/~marchal/