Oops, sorry, I misunderstood you. Thanks for the clarification. I
agree with your preceding post to Hal now.
On 13 June 2005, at 16:23, Jesse Mazer wrote:
Bruno Marchal wrote:
To Jesse: You apparently completely separate the probability of x
and x' from the similarity of x and x'.
I am not sure that makes sense to me.
In particular, how could x and x' be similar if x', but not x,
involves a 'white rabbit' event?
It's not completely separable, but I'd think that "similarity" would
mostly be a function of memories, personality, etc...even if I
experience something very weird, I can still have basically the same
mind. For example, a hoaxer could create a realistic animatronic
talking white rabbit, and temporarily I might experience an
observer-moment identical to what I'd experience if I saw a genuine
white talking rabbit, so the "similarity" between my current
experience and what I'd experience in a white rabbit universe would
be the same as the "similarity" between my current experience and
what I'd experience in a universe where someone creates a realistic
hoax. I don't think the first-person probabilities of experiencing
hoaxes are somehow kept lower than what you'd expect from a
third-person perspective, do you?
Perhaps I misunderstood you, but it seems to me that if you ask me
to compute P(x -> y) (your notation), it could and even should
change the predicted result. In particular, if the rabbit has been
generated by a genuine hoaxer, I would predict the white rabbit will
stay in y, and if the hoaxer is not genuine, then I would still
consider x and x' as very dissimilar. What do you think? This
follows *also* from a relativisation of Hal Finney's theory based on
Kolmogorov complexity: a stable white rabbit is expensive in
information resources. No?
Well, note that following Hal's notation, I was actually assuming y
came before x (or x'), and I was calculating P(y -> x). And your
terminology is confusing to me here--when you say "if the hoaxer is
not genuine", do you mean that the white rabbit wasn't a hoax but was
a genuine talking rabbit (in which case no hoaxer is involved at all),
or do you mean if the white rabbit *was* a hoax? If the latter, then
what do you mean when you say "if the rabbit had been generated by a
genuine hoaxer"--is the white rabbit real, or is it a hoax in this
case? Also, when you say you'd consider x and x' as very dissimilar,
do you mean from each other or from y? Remember that "similarity" is
just the word I use for continuity of personal identity: how much two
successive experiences make sense as being successive OMs of the "same
person". It doesn't refer to whether the two sensory experiences are
themselves similar or dissimilar. If I'm here looking at my
computer, then suddenly close my eyes, the two successive experiences
will be quite dissimilar in terms of the sensory information I'm
taking in, but they'll still be similar in terms of my background
memories, personality, etc., so they make sense as successive OMs of
the same person. On the other hand, if I'm sitting at my computer and
suddenly my brain is replaced with the brain of George W. Bush, there
will be very little continuity of identity despite the fact that the
sensory experiences of both OMs would be pretty similar, so in my
terminology there would be very little "similarity" between these two
OMs.
As for the cost of simulating a white rabbit universe, I agree it's
more expensive than simulating a non-white rabbit universe, but I
don't see how this relates to continuity of identity when experiencing
white rabbits vs. not experiencing them.
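To make the shape of the disagreement concrete, here is a minimal sketch of the kind of rule being discussed, not a claim about either of our actual theories. It assumes, purely for illustration, that the first-person probability of a transition from an observer-moment y to a candidate successor x is proportional to similarity(y, x) times the absolute measure of x, normalized over the candidates; the numerical values are hypothetical.

```python
# Illustrative sketch only: successor probability proportional to
# similarity(y, x) * measure(x), normalized over candidate successors.
# All names and numbers here are hypothetical, chosen to show how a
# white-rabbit successor with tiny measure ends up with low first-person
# probability even when its "similarity" (continuity of identity) is
# just as high as an ordinary successor's.

def transition_probabilities(similarity, measure):
    """similarity, measure: dicts mapping candidate OM -> value."""
    weights = {x: similarity[x] * measure[x] for x in similarity}
    total = sum(weights.values())
    return {x: w / total for x, w in weights.items()}

# Hypothetical values: all three successors preserve memories and
# personality equally well (same similarity), but differ in measure --
# e.g. a hoax is cheap to realize, a genuine talking rabbit is not.
sims = {"ordinary": 0.9, "hoax_rabbit": 0.9, "real_rabbit": 0.9}
meas = {"ordinary": 1.0, "hoax_rabbit": 1e-3, "real_rabbit": 1e-9}

probs = transition_probabilities(sims, meas)
```

On this toy rule, the hoax and the genuine rabbit are distinguished only by measure, not by similarity, which is exactly the point at issue: whether "expensive in information resources" should show up in the measure term, the similarity term, or both.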