Stathis Papaioannou wrote:

> Stathis Papaioannou writes:
> > QM or QTI do not imply
> > that you can never lose consciousness. The idea is that you can never
> > *experience* loss of consciousness. You can fall asleep, but when you wake
> > up, you don't remember being asleep. If you never wake up - i.e. if you die
> > in your sleep - then you never experience that particular branch of the MW.
> > In other words, you can only experience those worlds where the loss of
> > consciousness is temporary.

> How about impairment of consciousness?  Can you experience that?  Can you
> experience going crazy, or having a reduced level of consciousness where
> you are drugged or barely alive?  That's how death is for most people;
> it's not like flicking off a light.  Will Quantum Immortality protect you
> from spending an eternity in a near-coma?  Exactly how much consciousness
> does it guarantee you?
>
> Hal Finney

> Alas, you are right. Immortality is not all fun and games, and in some worlds you may experience a drawn-out fizzling out, reduced to the consciousness of an infant, then a fish, then an amoeba. I believe Max Tegmark acknowledged this in a commentary on his original paper. If you're really unlucky, you will experience eternal torment in the flames of hell. And unlike the Christian Hell, you don't actually have to do anything wrong to end up in QTI hell: it all depends on the fall of the cosmic dice.

> One question which comes up is: when do you stop being you? I suppose this is an answer to your "how much consciousness is guaranteed" question: when you lose enough consciousness that you forget who you are, that is the cutoff at which you can really be said to have lost consciousness.

I think that's too handwavey--to really have a satisfying answer to this question, you need some kind of formal theory of consciousness that answers questions like, "If I am currently experiencing observer-moment A, what is the probability that my next experience will be of observer-moment B vs. observer-moment C?" I think the answer should depend both on some sort of measure of the "similarity" of A and B vs. A and C (to deal with the 'when do you stop being you' question), and also on some notion of the absolute probability of B vs. C (for example, if B and C are equally 'similar' to your current experience A, but B is experiencing some kind of thermodynamic miracle while C is experiencing business as usual, then C would be more likely). I elaborated on these ideas in my posts in the "Request for a glossary of acronyms" thread at
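To make that concrete, here is a minimal sketch of the kind of rule I mean, in Python. The similarity and measure functions, and all the numbers, are made-up placeholders--a real theory of consciousness would have to supply them--but the shape of the rule is just: weight each candidate next observer-moment by (similarity to the current moment) times (absolute measure), then normalize.

    # Hypothetical sketch: P(next moment | current moment) proportional to
    # similarity(current, candidate) * measure(candidate), normalized.
    # 'similarity' and 'measure' are placeholders for whatever a formal
    # theory of consciousness would actually provide.

    def transition_probabilities(current, candidates, similarity, measure):
        """Return P(next = X | current) for each candidate X."""
        weights = {x: similarity(current, x) * measure(x) for x in candidates}
        total = sum(weights.values())
        if total == 0:
            raise ValueError("no candidate has nonzero weight")
        return {x: w / total for x, w in weights.items()}

    # Toy example mirroring the B-vs-C case above: B and C are equally
    # similar to A, but C ("business as usual") has far greater absolute
    # measure than B ("thermodynamic miracle"), so C comes out more likely.
    sim = {("A", "B"): 0.9, ("A", "C"): 0.9}
    mu = {"B": 1e-6, "C": 1.0}

    probs = transition_probabilities(
        "A", ["B", "C"],
        similarity=lambda a, x: sim[(a, x)],
        measure=lambda x: mu[x],
    )
    print(probs)  # C gets almost all the probability

On this rule, jumps to very dissimilar moments are suppressed by the similarity factor (the 'when do you stop being you' cutoff becomes a matter of degree), and thermodynamic miracles are suppressed by the measure factor, which matches the B-vs-C intuition above.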

