On 07 Aug 2011, at 15:50, benjayk wrote:

Bruno Marchal wrote:

On 06 Aug 2011, at 23:14, benjayk wrote:

Frankly I am a bit tired of this debate (to some extent of debating in
general), so I will not respond in detail any time soon (if at all). Don't
take it as total disinterest, I found our exchange very interesting, I am
just not in the mood at the moment to discuss complex topics at length.

There is no problem, Ben. I hope you will not mind if I comment your post.

Of course not, I am interested in your comments. I just wanted to make clear
why I responded briefly.

OK. Thanks for letting me know. I have to be brief also, because I am overwhelmed by summer work. I enjoy very much your attempt to understand what I try to convey.

Bruno Marchal wrote:

Bruno Marchal wrote:

Then computer science provides a theory of consciousness, and
explains how
consciousness emerges from numbers,
How can consciousness be shown to emerge from numbers when it is
assumed at the start?

In science we assume at some meta-level what we try to explain at some
level. We have to assume the existence of the moon to try theories
about its origin.
That's true, but I think this is a different case. The moon seems to have a past, so it makes sense to say it emerged from its constituent parts. In the
past, it was already there as a possibility.

OK, I should say that it emerges arithmetically. I thought you already understood that time is not primitive at all. More on this below.

But consciousness as such has no past, so what would it mean that it emerges from numbers? Emerging is something taking place within time. Otherwise we are just saying we can deduce it from a theory, but this in and of itself
doesn't mean that what is derived is prior to what it is derived from.

To the contrary, what we call numbers just emerges after consciousness has been there for quite a while. You might argue that they were there before, but I don't see any evidence for it. What the numbers describe was there before, this is certainly true (or you could say the numbers were implicitly there).

OK. That would be a real disagreement. I just assume that the arithmetical relations are true independently of anything. For example I consider the truth of the Goldbach conjecture as already settled in Platonia. Either it is true that every even number bigger than 2 is the sum of two primes, or it is not true, and this independently of any consideration of time, spaces, humans, etc. Humans can easily verify this for little even numbers: 4 = 2+2, 6 = 3+3, 8 = 3+5, etc. But we have not found a proof of this, although many people have searched for it. I can see that the expression of such a statement needs humans or some thinking entity, but I don't see how the fact itself would depend on anything (but the definitions).
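The small-case verification mentioned here can be mechanized. A minimal sketch (the function names are my own illustration, not from the thread):

```python
def is_prime(n):
    """Trial-division primality test; fine for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def goldbach_pair(n):
    """Return a pair of primes summing to the even number n > 2,
    or None if no such pair exists (no counterexample is known)."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# The decompositions from the email: 4 = 2+2, 6 = 3+3, 8 = 3+5.
for n in (4, 6, 8):
    print(n, goldbach_pair(n))
```

Of course a program can only check finitely many cases; the open question is precisely whether the property holds for *all* even numbers bigger than 2.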

Bruno Marchal wrote:

It's a bit like assuming A, and because B->A is true if A is true, we can
claim for any B that B is the reason that A is true.

This confirms you are confusing two levels. The level of deduction in
a theory, and the level of implication in formal logic.
I am not saying it's the same. I just don't see that because we can formally
deduce A from B, this means that A in reality emerges from B.

What I say is more subtle. I will make an attempt to be clearer below.

Bruno Marchal wrote:

Consciousness is simply a given. Every "explanation" of it will just
describe what it is and will not determine its origin, as its origin would
need to be independent of it / prior to it, but could never be known to be
prior to it, as this would already require consciousness.

In the comp theory it can be explained why machines take consciousness
as a given, and why, from their first person points of view, they are
completely correct about this.

Bruno Marchal wrote:

Yet, consciousness is not assumed as
something primitive in the TOE itself.
But this doesn't really matter, as we already assume that it's primitive,
because we use it before we can even formulate anything.

We already assumed it exists, sure. But why would that imply that it exists primitively? It exists fundamentally: in the sense that once you have all the true arithmetical relations, consciousness exists. So, consciousness is not something which appears or emerges in time or space, but it is not primitive in the sense that its existence is a logical consequence of arithmetical truth (provably so when we assume comp and accept some definitions).

Sometimes I sketch this in the following manner. The arrows are logico-arithmetical deductions:


You can't just
ignore what you already know, by not making your assumptions explicit in
your theory.

It is just not an assumption in the theory, but a derived existence. With comp, consciousness is implicit in the arithmetical truth.

Bruno Marchal wrote:

Bruno Marchal wrote:

Bruno Marchal wrote:

Bruno Marchal wrote:

And in that sense, comp provides, I think, the first coherent
picture of
almost everything, from God (oops!) to qualia, quanta included,
this by assuming only seven arithmetical axioms.
I tend to agree. But its coherent picture of everything includes the
possibility of infinitely many more powerful theories. It may be possible
to represent every such theory with arithmetic - but then we can represent
every arithmetical statement with just one symbol and an encoding scheme,
still we wouldn't call "." a theory of everything.
So it's not THE theory of everything, but *a* theory of everything.

Not really. Once you assume comp, the numbers (or equivalent) are
enough, and very simple (despite being mysterious).
They are enough, but they are not the only way to make a theory of
everything. As you say, we can use anything as powerful as numbers, so
there is an infinity of different formulations of theories of everything.
For any theory, you have infinities of equivalent formulations. This
is not a defect. What is amazing is that they can be very different
(like cellular automata, LISP, addition+multiplication on natural
numbers, quantum topology, billiard balls, etc.).
I agree. It's just that in my view the fact that they can be very different
makes them ultimately different theories, only theories about the same
thing.
And proving the same things, with equivalent explanation.
Sure, we can write indistinguishable programs (to the user) with different
programming languages as well. Still they are different programming
languages, and they are only equivalent with respect to what they can
compute, not at all practically.


Bruno Marchal wrote:

Different theories may explain the same thing, but in practice, they
may vary in their efficiency to explain it, so it makes sense to
treat them
as different theories.

But the goal here is a conceptual understanding, not direct practical
application.
OK, but even in this case it seems that this is easier in some languages
than in others.

And even the number theology is far easier to explain with the combinators than with the numbers, which explains why I have made attempts to explain the combinators on the list. But people are more familiar with the numbers, and there is a sort of elegance in doing it with the simplest theory on which everybody agrees (except the sunday philosopher :).

Bruno Marchal wrote:

In theory, even one symbol can represent every statement in any language.
That does not make sense to me (or it is trivial).
Yes, it is trivial. We just encode statements with numbers expressed with
one symbol (e.g. + is I, 1 is II, ...).

How will you encode with one symbol the statement 1+1=2? I think you will need two symbols, unless you fix a bijection between a language and the natural numbers. But you will need a richer language to describe that bijection. I will not insist, because encoding techniques are not relevant to the conceptual point.
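The point about the bijection can be made concrete. A sketch of my own (not from the thread): to write 1+1=2 with the single symbol "|" you first fix a bijection from strings to positive naturals, and that bijection itself must be described in a richer language, here Python:

```python
ALPHABET = '12+='  # the richer language lives here, at the meta-level

def to_number(statement):
    """Fixed bijection from strings over ALPHABET to positive naturals
    (bijective base-k numeration, so distinct strings get distinct numbers)."""
    n = 0
    for ch in statement:
        n = n * len(ALPHABET) + ALPHABET.index(ch) + 1
    return n

def to_unary(statement):
    """Write the statement's number with the single symbol '|'."""
    return '|' * to_number(statement)

print(to_number('1+1=2'))  # 482, so '1+1=2' becomes a run of 482 '|' symbols
```

The symbol "|" carries no structure by itself; all the expressive power sits in ALPHABET and to_number, which is exactly the "richer language to describe that bijection".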

Bruno Marchal wrote:

but still it's not as powerful as the language it represents.

Similarly, if you use just natural numbers as a TOE, you won't be able to
directly express important concepts like dimensionality.

Why? If you prove this, I abandon comp immediately.
Hm, how do you express the point (3,4) on a two-dimensional plane with
natural numbers?

I might use a Gödel-like coding for the string "(s(s(s(0))), s(s(s(s(0)))))", like coding "(" by 2, "s" by 3, "0" by 4 and ")" by 5, and then coding the string itself, using the prime numbers, by 2^2 * 3^3 * 5^2 * 7^3 * etc. That is, each prime number is raised to the power of the code of the corresponding symbol. Or something like that, where I can code an axiomatization of the plane by a number too, etc.
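This coding can be spelled out as a small program. A sketch using the symbol codes from the paragraph above, plus a hypothetical code 6 for the comma, which the email does not assign:

```python
CODES = {'(': 2, 's': 3, '0': 4, ')': 5, ',': 6}  # comma code is my addition

def primes():
    """Yield 2, 3, 5, 7, ... by trial division."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def numeral(n):
    """Peano numeral for n, e.g. 3 -> 's(s(s(0)))'."""
    return 's(' * n + '0' + ')' * n

def godel_number(string):
    """Code a string as the product of p_i ** code(symbol_i): the i-th
    prime raised to the code of the i-th symbol."""
    g = 1
    for p, ch in zip(primes(), string):
        g *= p ** CODES[ch]
    return g

point = '(' + numeral(3) + ',' + numeral(4) + ')'  # the point (3,4)
print(godel_number(point))
```

Because prime factorization is unique, the string, and hence the point, can be recovered from that single natural number.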

It seems we have to interpret the numbers in a certain way to do this, and
can't express it directly. If we used Gaussian integers we could simply
describe the point as 3+4i.

That's OK, but 3+4i can itself be coded, by 2^(code of 3)*3^(code of +)*5^(code of 4) *7^(code of i).

But, really, a notion like the plane will be encoded in the brain of the observers, and the observers are described by the arithmetic relations participating in the computations leading to that observer state, which exists in arithmetic (once you assume that you are a digitalisable machine).

Bruno Marchal wrote:

From comp you can
derive the whole of physics, and this should be easy to understand if
you get the UDA1-7.
Well, I get that if we accept COMP we need to associate sheaves of
computations to mind-states, but I have no clue how natural numbers can be
used to derive physics, or even formulate anything related to physics,
without using a meta-level of interpretation. It seems we always need a
more powerful language to do that.

So physics becomes a first person uncertainty calculus, associating to each computational state a collection of computations, hopefully with a reasonable measure (which has to be derived by the self-reference logic).

The meta-level of comprehension can be embedded in the arithmetical truth, in the same way that Gödel discovered that metamathematics can be embedded in (and retrieved from) arithmetic.

Bruno Marchal wrote:

Comp remains incomplete on God, consciousness and
souls, and can explain why, but physics, including dimensionality, is
entirely explained. To be sure comp is still "hesitating" between
dimension 2 and dimension 24 for the shadow of the notion of space,
but this is a very complex mathematical problem, and it assumes that
the Z1* logic (the "divine" third person plural points of view) gives
rise to some mathematical structure (Temperley-Lieb algebra, braid groups).
But how can you formulate dimension 2 / 24 or Z1* logic in arithmetic?

Z1* is the logic of Bp & Dt & p; the p are arithmetical propositions, and the B and D are the Beweisbar arithmetical predicate and its dual (D = ~B~). The Gödel-like arithmetization does the remaining work. To be sure, I use numbers, but also recursively enumerable (or not) sets of numbers. Some things, like truth, and knowledge, and sensations, will not have a direct translation, but still an indirect one in the minds of the humans or machines trying to understand this theory. This is what makes the theory possible, and what makes it non-reductionist: from inside arithmetic, arithmetical truth is bigger than arithmetic. Technically this can be related to the Skolem paradox (which is not a contradiction, but some unavoidable classical logical weirdness).

Remember: I do assume comp, and whatever your conception of space and dimension, it is already represented in your brain through neuronal relations (say), and those neuronal relations are themselves represented, even emulated, in arithmetic.

I mean, you don't have to explain it precisely, but can you give a hint how
this could even be conceived to be possible?

I hope that what I say above helps a bit. Let me try again.

We assume comp. So you can imagine that all the "reality" we observe is secondary: we might be dreaming, or sharing "a video game", or being in a matrix (a giant computer handling the whole game and the software of our minds), OK?

Now, and that is not obvious, but is rather well known by logicians, such a matrix is emulated by the arithmetical consequences of the laws of addition and multiplication. The step 8 of UDA explains that, assuming we are physical computers, we cannot distinguish a physical computer from its infinitely many arithmetical emulations, and that in the end, by taking into account the first person indeterminacy, whatever we can observe below our substitution level results from a sort of competition among *all* universal machines/numbers. That set of numbers is not a computable set, and only God knows the winner. Yet, machines can backtrack from observation and introspection to get a better and better picture.

I am not proposing an explanation of "reality"; on the contrary, I show that a very common hypothesis, mechanism (made clear through Church's thesis and computer science), makes the mind body problem two times more difficult than it is usually understood. It makes the physical laws more mysterious, and it leads to a purely arithmetical body problem. And at first sight, it does look like a refutation of comp, because if we just look at the computations, we can expect an inflation of possibilities (the white rabbit problem). It looks like even if we were in one winning computation, perhaps physical, we would first be sent into a solipsistic mental space, and then get dissolved in white noise. And that, admittedly, is not confirmed by the experiments nor experience, except with salvia perhaps :).

Yet this is too vague for refuting comp, and by using the self-reference logic to treat machines (numbers) as a model of "persons" playing in that matrix, or even by interviewing directly the abstract universal Löbian Machine, the LUM, we can show that things are mathematically much more complex. In particular, the use of the oldest and most classical theory of knowledge (Theaetetus), of the most classical theory of truth (Tarski), and of the most transparent theory of machine's rational believability (Gödel's arithmetical beweisbar) leads to an arithmetical interpretation of Plotinus, with, to be sure, serious departures too, but that makes the talk of the machine even more interesting. Plotinus is a rationalist platonist mystic and is closer to some form of Eastern objective idealist theology than to Aristotelian naturalism (even if the latter is methodologically fertile in its applications in the neighborhood).

The Church thesis rehabilitates Pythagoras. CT makes Pythagoras consistent, and the comp hypothesis makes Pythagoras unavoidable and refutable (by its derived physics, and that makes comp refutable).


Sent from the Everything List mailing list archive at Nabble.com.

You received this message because you are subscribed to the Google Groups "Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com . For more options, visit this group at http://groups.google.com/group/everything-list?hl=en .

