Hi George,

>Your effort to clarify your position is admirable and I hope other 
>everythingers are following your explanation. You are still too 
>technical for my taste, and I may simply have to give up on 
>understanding your position at a critical and constructive
>level rather than just at a spectator level.

I will try to be less technical in the future. 

>I agree fully with your UDA. To summarize: Basically the perception
>of consciousness as seen from the first person is continuous, 
>independently of interruptions in time,
>separation in space, (virtual) implementation level, and number of 
>branchings (and number of fusings or mergings... my addition).

Yes. Just to be clear: those fusings and splittings cannot change
the nature of the experiences, but of course they can change the
degree of possibility/probability of those experiences.

>Yes. Because of UDA, consciousness doesn't seem to be tied down 
>to one particular "Schmidhuberian universe." In fact the environment 
>where consciousness operates seems to be the whole plenitude and I agree
>with you when you expressed reluctance in talking about a "universe." We
>are in the plenitude but our senses can only perceive a thin slice of it.
>More precisely, for each one of us, "I-plural" am in the
>plenitude, but "I-plural" can perceive only a thin slice of it. Each "I" 
>has a different perspective.


>Yes. consistency in the (virtual) linkage that connects the present 
>state with all immediate future states (consistent extensions) is the 
>filter that defines consciousness from non-consciousness. A corollary is
>that non-consistent extensions are not conscious. The measure problem 
>is how to quantify the relative number of
>(future) extensions having different attributes.


>Let's use English. []<>t:   prove not consistent?

Here is a set of English translations (where []p means "I prove p").

[]t   I prove my favorite tautology
[]f   I prove FALSE, I prove a contradiction, I am inconsistent
<>t   = -[]-t = -[]f = I don't prove FALSE, I am consistent
[][]t   I prove that I prove t
[]<>t   I prove that I am consistent

Strictly speaking []p is not "I prove p", but "I *can* prove p" or
"it is provable by me that p".

You can always interchange:

  -[]-  with  <>
  -[]   with  <>-
  -<>   with  []-
  -<>-  with  []

Exercise: Convince yourself that

-[][][][][][][][][][][]p   is the same as   <><><><><><><><><><><>-p
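These interchanges can be checked mechanically. Here is a small sketch
(my own illustration, not part of the original post) that pushes
negations inward by applying the dualities as string rewrites:

```python
# Sketch: normalize a modal formula written with "[]", "<>" and "-"
# by pushing negations inward, using the dualities
#   -[]A = <>-A,   -<>A = []-A,   and double negation  --A = A.

def normalize(formula: str) -> str:
    """Apply the rewrite rules until none of them matches."""
    rules = [("-[]", "<>-"), ("-<>", "[]-"), ("--", "")]
    changed = True
    while changed:
        changed = False
        for old, new in rules:
            if old in formula:
                formula = formula.replace(old, new, 1)
                changed = True
    return formula

# The exercise: one negation in front of eleven boxes
# turns into eleven diamonds in front of one negation.
print(normalize("-[][][][][][][][][][][]p"))
# -> <><><><><><><><><><><>-p
```

Each pass moves the negation sign one box to the right, turning the box
it crosses into a diamond, until only "-p" is left at the end.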

>Ok. Physics is pattern of laws perceived by the consciousness observing 
>the plenitude. The consistency filter that restricts consciousness is the 
>same filter that restricts the world that consciousness observes. This is 
>why the world is understandable and this is why there are no white rabbits.
>White rabbits are not consistent.

Unfortunately I don't think this is true. The problem with the white
rabbits is that they are consistent! For example, we can dream of
white rabbits. Later, when I explain Gödel's result, we will see
the consistency of inconsistency!

Oh well, I can explain it now:

As I said, the first theorem in the "psychology of machines" is Gödel's
second incompleteness theorem. I recall it first in English:

If I am consistent then I cannot prove that I am consistent.

Now "I am consistent" is the same as "I cannot prove f" (f = FALSE), or
"I cannot prove -t". OK.

In modal notation, Gödel's second theorem can be written:

(-[]-t) -> (-[]-[]-t)     or     <>t -> -[]<>t     (recall <> = -[]-)

You can read -[]A as "I cannot prove A". This is the same as "-A is
consistent" (because if -A were not consistent, you could derive f from
-A, and then A would be provable by reductio ad absurdum).

So the unprovability of consistency entails the consistency of the
provability of the false, that is the consistency of inconsistency.
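This modal formula can also be checked semantically. The provability
logic G is sound and complete for finite transitive and irreflexive
Kripke frames, so a brute-force test over all such small frames is a
useful sanity check. A sketch (my own illustration; the frame size and
the coding of relations are arbitrary choices) verifying <>t -> -[]<>t
on every transitive irreflexive frame with up to three worlds:

```python
from itertools import product

def is_transitive(R):
    # wRv and vRu must imply wRu
    return all((a, c) in R for (a, b) in R for (b2, c) in R if b2 == b)

def is_irreflexive(R, n):
    return all((w, w) not in R for w in range(n))

def theorem_holds(R, n):
    """Check  <>t -> -[]<>t  at every world of the frame (range(n), R)."""
    for w in range(n):
        successors = [v for v in range(n) if (w, v) in R]
        diamond_t = bool(successors)  # <>t: w can reach some world
        # []<>t: every successor of w can itself reach some world
        box_diamond_t = all(any((v, u) in R for u in range(n))
                            for v in successors)
        if diamond_t and box_diamond_t:
            return False  # a world where <>t holds but -[]<>t fails
    return True

# Enumerate every relation on frames of 1, 2 or 3 worlds and keep
# the transitive irreflexive ones (the frames of provability logic).
all_ok = True
for n in range(1, 4):
    pairs = [(a, b) for a in range(n) for b in range(n)]
    for bits in product([False, True], repeat=len(pairs)):
        R = {p for p, keep in zip(pairs, bits) if keep}
        if is_transitive(R) and is_irreflexive(R, n):
            all_ok = all_ok and theorem_holds(R, n)
print(all_ok)  # True: no counterexample frame found
```

Transitivity plus irreflexivity on a finite frame rule out cycles, so
any world that sees something also sees a dead end, and a dead end is
exactly a world where <>t fails; that is what -[]<>t asserts.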

Don't confuse inconsistency,  -<>t  (that is, []-t, i.e. []f),
with the consistency of inconsistency,  <>-<>t  (that is, -[]<>t).

It is important because that is why consistent machines can be wrong,
or can be dreaming or ... can (without losing consistency) see White
Rabbits. The only thing we can hope to show is that white rabbits are
relatively rare.

>I am not very familiar with the concept of G*.

No problem. I will make a special post on G and G*.

>The movies "The Matrix" and "Tron" also deserve a mention

I agree with "Tron". Concerning "The Matrix", I would just mention that
it describes a non-computationalist civilisation. The virtual world is
still the product of biological neurons, and the people still have
organic bodies... It just describes shared biological dreams.

I will discuss your other comments later.

