On 12 June 2005, at 06:30, Jesse Mazer wrote:

My speculation is that p(y -> x) would depend on a combination of some function that depends only on intrinsic features of the descriptions of x and y--how "similar" x is to y, basically, with the details to be determined by some formal "theory of consciousness" (or 'theory of observer-moments', perhaps)--and the absolute probability of x. If two possible future OMs x and x' are equally "similar" to my current OM y, but x has a higher absolute measure than x' (perhaps x' involves an experience of a 'white rabbit' event), then I'd expect p(y -> x) to be larger than p(y -> x').

To Jesse: You apparently completely separate the probability of x and x' from the similarity of x and x'.
I am not sure that makes sense to me.
In particular, how could x and x' be similar if x', but not x, involves a 'white rabbit' event?
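(For concreteness only: the following is a minimal sketch, not taken from Jesse's post, of the kind of rule he seems to describe. The similarity function sim and the absolute measure m are purely hypothetical placeholders; the point is just that the two ingredients are kept separate and then combined and normalised.)

# Hypothetical sketch of a transition rule p(y -> x) proportional to
# sim(x, y) * m(x), normalised over the candidate successor OMs.
def transition_probabilities(y, candidates, sim, m):
    """Return {x: p(y -> x)} for each candidate successor OM x.

    sim(x, y): similarity of description x to description y (assumed to be
               supplied by some 'theory of observer-moments').
    m(x):      absolute measure of OM x (assumed to be supplied by the
               global measure).
    """
    weights = {x: sim(x, y) * m(x) for x in candidates}
    total = sum(weights.values())
    return {x: w / total for x, w in weights.items()}

# Toy usage: x and x_prime equally similar to y, but x_prime has a lower
# absolute measure (say, a 'white rabbit' continuation), so p(y -> x_prime)
# comes out smaller.
if __name__ == "__main__":
    sim = lambda x, y: 1.0                      # equal similarity, for illustration
    m = {"x": 0.9, "x_prime": 0.1}.__getitem__  # unequal absolute measure
    print(transition_probabilities("y", ["x", "x_prime"], sim, m))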

To Hal: I don't understand how an OM could write a letter. Writing a letter, it seems to me, involves many OMs. Evolution, and more generally "computational history", *is* what gives sense to any OM. What evolves is more "real" than the many (subjective or objective) bricks on which evolution proceeds and on which histories locally rely.

To Russell: I don't understand what you mean by a "conscious description". Even the expression "conscious machine" can be misleading at some point in the reasoning. It is really some person, who can be (with comp) associated relatively to a machine/machine-history, who can be conscious. Imo, only a person can be conscious. Even the notion of OM, as it is used in most of the recent posts, seems to me to be a construction of the mind of some person. It is personhood which makes it possible to attribute some sense to our many living 1-person OMs.


