On 13 June 2005, at 15:39, Jesse Mazer wrote:

Bruno Marchal wrote:
> To Jesse: You apparently completely separate the probability of x and x' from the similarity of x and x'. I am not sure that makes sense to me. In particular, how could x and x' be similar if x', but not x, involves a 'white rabbit event'?

Jesse Mazer replied:
> It's not completely separable, but I'd think that "similarity" would mostly be a function of memories, personality, etc. Even if I experience something very weird, I can still have basically the same mind. For example, a hoaxer could create a realistic animatronic talking white rabbit, and temporarily I might experience an observer-moment identical to what I'd experience if I saw a genuine talking white rabbit. So the "similarity" between my current experience and what I'd experience in a white-rabbit universe would be the same as the "similarity" between my current experience and what I'd experience in a universe where someone creates a realistic hoax. I don't think the first-person probabilities of experiencing hoaxes are somehow kept lower than what you'd expect from a third-person perspective, do you?
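One way to make the disagreement concrete is a toy formalization (my own sketch, not a formula either correspondent has written down): treat the first-person probability of the transition x -> y as proportional to an objective "measure" m(y) of the successor observer-moment times its "similarity" s(x, y) to the current one, then normalize. The numbers below are purely illustrative.

```python
# Toy sketch of a first-person transition probability P(x -> y),
# assumed here to be proportional to measure(y) * similarity(x, y).
# The candidate successors and all numbers are hypothetical.

def transition_probs(measures, similarities):
    """Normalize measure-times-similarity weights into probabilities."""
    weights = {y: measures[y] * similarities[y] for y in measures}
    total = sum(weights.values())
    return {y: w / total for y, w in weights.items()}

# Three hypothetical successors: an ordinary continuation, a convincing
# animatronic hoax, and a genuine white-rabbit universe.  The hoax and
# the rabbit are equally similar to x (the experiences match), but the
# rabbit universe is assigned far lower measure.
measures = {"normal": 0.90, "hoax": 0.0999, "rabbit": 0.0001}
similarities = {"normal": 1.0, "hoax": 0.6, "rabbit": 0.6}

probs = transition_probs(measures, similarities)
print(probs)  # the rabbit ends up far less likely than the hoax
```

On this reading, Jesse's point survives: similarity alone cannot suppress the hoax relative to the rabbit; any suppression of the genuine white rabbit has to come from the measure term.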

Perhaps I misunderstood you, but it seems to me that if you ask me to compute P(x -> y) (your notation), it could and even should change the prediction result. In particular, if the rabbit has been generated by a genuine hoaxer, I would predict the white rabbit will stay in y; and if the hoaxer is not genuine, then I would still consider x and x' as rather very dissimilar. What do you think? This also follows from a relativisation of Hal Finney's theory based on Kolmogorov complexity: a stable white rabbit is expensive in information resources. No?
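The Kolmogorov-complexity point can be illustrated crudely. K is uncomputable, so the sketch below uses compressed length as a loose upper-bound proxy (a standard heuristic, not Hal Finney's actual construction) and weights each toy "universe history" by 2^-K, Solomonoff-style; the histories themselves are made-up stand-ins.

```python
import zlib

# Crude proxy: Kolmogorov complexity K is uncomputable, so use the
# zlib-compressed length (in bits) as an upper bound, and weight each
# history by 2**(-K), in the spirit of a universal prior.

def weight(history: bytes) -> float:
    """2^-K proxy weight using compressed length in bits."""
    k_bits = 8 * len(zlib.compress(history, 9))
    return 2.0 ** (-k_bits)

# A lawful history is highly regular, hence cheap to describe; a history
# that sustains a stable white rabbit carries extra, hard-to-compress
# detail on top of the same regular background.
lawful = b"tick " * 100
rabbit = b"tick " * 100 + bytes(range(64))  # extra "rabbit" bits

print(weight(lawful) > weight(rabbit))  # prints True
```

The stable rabbit's extra description length translates directly into an exponentially smaller weight, which is one way to cash out "expensive in information resources."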

Bruno
http://iridia.ulb.ac.be/~marchal/