>Why care for the subset of provable sentences?
>Aren't we interested in the full set of all describable sentences?
We are interested in the true sentences; the provable ones are just the subset a sound machine can reach.
>We can generate it, without caring for proofs at all.
If you mean generate all describable *sentences*, this is trivial,
but then you generate only a language.
If you mean generate all computations, that is equivalent to generating
the proofs of all \Sigma_1 sentences (= the "arithmetical" UD).
The provability predicate is just a universal \Sigma_1 sentence.
(Recall that a \Sigma_1 sentence is a sentence equivalent to (Ex)P(x),
with P algorithmically decidable.)
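A minimal sketch of that last point (my own Python illustration, not anything from this exchange): a \Sigma_1 sentence (Ex)P(x) with decidable P is semi-decidable, since we can just enumerate candidate witnesses x and test each with P. The example sentence and the optional bound are purely illustrative.

```python
def sigma1_search(P, bound=None):
    """Search x = 0, 1, 2, ... for a witness of the Sigma_1 sentence (Ex)P(x).

    Halts (returning the witness) iff the sentence is true; with a
    finite bound it is only a partial approximation of that search."""
    x = 0
    while bound is None or x < bound:
        if P(x):
            return x      # witness found: the sentence is true
        x += 1
    return None           # no witness below the bound (left undecided)

# Example Sigma_1 sentence: "some number greater than 1 divides 91"
print(sigma1_search(lambda x: x > 1 and 91 % x == 0))  # -> 7
```

Without a bound, a false \Sigma_1 sentence makes the search run forever, which is exactly the semi-decidability at stake.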
>"Unspeakable" means "beyond formal definition."
Beyond Turing nameability?
Beyond formal definition by whom? You know the sound universal
machine has no means to define its own truth predicate (Tarski). Is it
in that sense? The knowledge of p by the machine can be defined (in a first
approximation) by "p & Bew('p)", in which case, although thoroughly
clear at the metalevel, the knowledge *by* the machine is beyond
any formal definition *for* the machine.
Bew = Gödel's provability predicate; 'p = the Gödel number of "p".
Bew(x) = (Ey)B(y,x), where B(y,x) means "y is the Gödel number of a proof
of the sentence with Gödel number x".
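To make the shape of Bew(x) = (Ey)B(y,x) concrete, here is a toy sketch (my own illustration: a trivial invented string-rewriting system stands in for arithmetic, so the inference rules and everything else here are assumptions, not Gödel's actual construction). B(y,x) is a decidable check that y is a derivation of x; Bew is the resulting \Sigma_1 search over derivations, here cut off at a finite bound.

```python
AXIOM = "a"

def step_ok(s, t):
    """Decidable check: t follows from s by one of the two toy rules
    (append 'b', or prepend 'a')."""
    return t == s + "b" or t == "a" + s

def B(y, x):
    """Decidable proof relation: the list y is a derivation of x,
    i.e. it starts from the axiom, each step applies a rule,
    and it ends in x."""
    return (len(y) > 0 and y[0] == AXIOM and y[-1] == x
            and all(step_ok(y[i], y[i + 1]) for i in range(len(y) - 1)))

def Bew(x, max_len=10):
    """The Sigma_1 search (Ey)B(y,x): try all derivations up to max_len.
    Only semi-decidable in general; the bound makes this an approximation."""
    frontier = [[AXIOM]]
    for _ in range(max_len):
        for y in frontier:
            if B(y, x):
                return True
        frontier = [y + [t] for y in frontier
                    for t in (y[-1] + "b", "a" + y[-1])]
    return False  # no proof found below the bound

print(Bew("aab"))  # True: a -> ab -> aab
print(Bew("ba"))   # False: every theorem here starts with "a"
```

The "knowledge" predicate of the text then pairs truth with provability: knows(p) would be `p and Bew('p')`, clear at the metalevel but not definable inside the toy system itself.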
>In particular, the
>way we humans communicate and agree on a common formal language is not
OK. Most importantly, the very meaning of "finite", or of number, cannot
be formally defined. But I show that, insofar as we are sound universal
machines, our own knowledgeability "predicate" cannot be formally defined either.
>You write "x" and I say, ok, it's an "x", but that's
>all based on some sort of informal agreement that involves complex
>pattern recognition and learning processes. Usually we do not question
>the validity of outcomes of such cognitive processes, and just go ahead
>with formal derivations.
Yes, but the situation is worse for the very *meaning* of our formalisms.
>But there is this major informal step *before*
>formal reasoning can start, and good textbooks on logic acknowledge
There is *this* one, but there is a deeper one concerning the *meaning*
of our formal presentations. We really bet that we share a common intuition
of the notion of finitary procedure.
Note that intuitionists and classicists diverge beyond the natural numbers.
We are not on the same wavelength, Juergen. You believe there
is a computable universe. I am agnostic about that. But if I survive
(or just experience nothing) through a digital functional substitution
made at some level L, then my experience is defined by the set of all
consistent extensions defined by all consistent histories definable
below L. With comp, realities emerge from the way consistent machines'
memories organise themselves (through the many, many histories going
through their states).
Your speed prior would be useful if it showed how it enhances
the weight of normal stories in the limit of first-person experiences
(due to unawareness of the reconstitution delays).
Note that the quantum quantization of classical theories provides such
an explanation. I show that comp needs to extract that quantization from
a measure on the consistent extensions. I show that "probability one"
obeys the main axioms of quantum logic.
You often ask me why I give so much importance to the notion
of provability. The reason is that, in some sense, I just interview
scientific machines, as they can appear in the GP's work, about what
they are able to prove about their own consistent extensions, and about how
and why they can arrive at probability considerations. And "provability"
is non-trivial enough to provide clues about the minimal logical
structure needed on that set of consistent extensions.