On 8/26/2012 2:09 PM, Bruno Marchal wrote:
On 25 Aug 2012, at 15:12, benjayk wrote:
Bruno Marchal wrote:
On 24 Aug 2012, at 12:04, benjayk wrote:

But this avoids my point that we can't imagine that levels and context
ambiguity don't exist, and this is why computational emulation does not mean
that the emulation can substitute the original.

But here you make a confusion of levels, as I think Jason tried to point out.
A similar one to the one made by Searle in the Chinese Room.
As an emulator (computing machine), Robinson Arithmetic can simulate
Peano Arithmetic exactly, even as a prover. So for example Robinson
Arithmetic can prove that Peano Arithmetic proves the consistency of
Robinson Arithmetic.
But you cannot conclude from that that Robinson Arithmetic can prove
its own consistency. That would contradict Gödel II. When PA uses the
induction axiom, RA might just say "huh", and apply it for the sake of
the emulation without any inner conviction.

I agree, so I don't see how I confused the levels. It seems to me you
just stated that Robinson Arithmetic indeed cannot substitute Peano
Arithmetic; RA's emulation of PA only makes sense with respect to PA (in
cases where PA does a proof that RA can't do).
Right. It makes only first person sense to PA. But then RA has
succeeded in making PA alive, and PA could a posteriori realize that
the RA level was enough.
Like I converse with Einstein's brain's book (à la Hofstadter), just
by manipulating the pages of the book. I don't become Einstein through
my making of that process, but I can have a genuine conversation with
Einstein through it. He will know that he has survived, or that he
survives through that process.
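The book analogy above can be sketched as a mechanical lookup process; the "pages" and their contents here are invented stand-ins for illustration only:

```python
# Toy sketch of the "Einstein's brain book" idea. The book contents are
# hypothetical; the point is that the operator converses by mechanically
# turning pages, without understanding or becoming the person in the book.
book = {
    "Did you survive?": "I survive through this very process.",
    "What is spacetime?": "A four-dimensional manifold.",
}

def converse(question):
    # The manipulator only looks up pages; the answers come from the book.
    return book.get(question, "The book has no page for that question.")

print(converse("Did you survive?"))  # I survive through this very process.
```

Whether such a lookup process hosts a first-person perspective is, of course, exactly what the thread is disputing.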
Please explain this statement! How is there an "Einstein", a person
that will know anything, in that case? How is such an entity
capable of "knowing" anything that can be communicated? Surely you are
not considering a consistently solipsistic version of Einstein! I don't
have a problem with that possibility per se, but you must come clean.
That is, it *needs* PA to make sense, and so
we can't ultimately substitute one with the other (just in some relative
way, if we are using the result in the right way).
Yes, because that would be like substituting one person for another, on
the pretext that they both fill the same role. But comp substitutes the
lower process, not the high-level one, which can indeed be quite different.
Is there a spectrum or something similar to it for substitution levels?
It is like the word "apple" cannot really substitute a picture of an apple
in general (still less an actual apple), even though in many contexts we can
indeed use the word "apple" instead of using a picture of an apple - if
we don't want to be shown how it looks, but just to know that we talk about
apples - but we still need an actual apple, or at least a picture, to make
sense of it.
Here you make an invalid jump, I think. If I play chess on a computer,
and make a backup of it, and then continue on a totally different
computer, you can see that I will be able to continue the same game
with the same chess program, even though the computer is totally
different. I just have to re-implement it correctly. Same with comp.
Once we bet on the correct level, functionalism applies to that level
and below, but not above (unless of course I am willing to have
some change in my consciousness, like amnesia, etc.).
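The chess-backup point can be sketched concretely; the state format below is a hypothetical minimal one (just the move list), not any particular chess program's:

```python
import json

# Minimal sketch: a chess game's state is just data, so it can be saved on
# one machine and resumed on a completely different one, and the game
# continues as if nothing happened. The file name and format are invented.
def save_game(moves, path):
    with open(path, "w") as f:
        json.dump({"moves": moves}, f)

def load_game(path):
    with open(path) as f:
        return json.load(f)["moves"]

# On machine A:
save_game(["e4", "e5", "Nf3"], "backup.json")
# On machine B (any architecture that runs the same program):
resumed = load_game("backup.json")
print(resumed)  # ['e4', 'e5', 'Nf3']
```

The substitution works because only the level of the game state matters; everything below it (hardware, OS, file system) can be swapped freely.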
But this example implies the necessity of the possibility of a
physical implementation; what is universal is only that no particular
physical system is required for the chess program.
With comp, to make things simple, we are high-level programs. Their
doing is 100% emulable by any computer, by definition of programs and
computations.
I agree with this, but anything that implies interactions between
separate minds implies separation of implementations, and this only
happens in the physical realm. Therefore the physical realm cannot be
Bruno Marchal wrote:
With Church's thesis, computing is an absolute notion: all universal
machines compute the same functions, and can compute them in the same
manner as all other machines, so that the notion of emulation (of
processes) is also absolute.
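The universality claim can be illustrated with a toy interpreter; the two-instruction register language below is a standard counter-machine sketch, chosen here for brevity, not anything from the thread:

```python
# Toy illustration of universality: an interpreter for a minimal
# two-instruction counter-machine language. Any universal machine can run
# this interpreter and thereby emulate every program of the toy language --
# Church's thesis says "computable" does not depend on which universal
# machine does the emulating.
def run(program, registers):
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "inc":        # inc r: add 1 to register r
            registers[args[0]] += 1
            pc += 1
        elif op == "jzdec":    # jzdec r, t: if r == 0 jump to t, else decrement r
            if registers[args[0]] == 0:
                pc = args[1]
            else:
                registers[args[0]] -= 1
                pc += 1
    return registers

# Addition as a toy program: drain register 1 into register 0.
add = [("jzdec", 1, 3), ("inc", 0), ("jzdec", 2, 0)]
print(run(add, {0: 9, 1: 2, 2: 0}))  # {0: 11, 1: 0, 2: 0}
```

The same `add` program yields the same result whatever hardware, language, or emulation layer executes `run` - that machine-independence is what "absolute" means here.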
OK, but the Church-Turing thesis is not proven, and I don't consider it true.
That's fair enough. But personally I find CT very compelling. I doubt
it less than the "yes doctor" part of comp, to be specific.
How is Deutsch's version different?
I don't consider it false either; I believe it is just a question of the
level at which we think about computation.
This I don't understand. Computability does not depend on any level.
I don't understand either.
Also, computation is only absolute relative to other computations, not with
respect to other levels, and not even with respect to the instantiation of
computations through other computations. Because here the instantiation and
description of the computation matter - IIIIIIIII+II=IIIIIIIIIII and 9+2=11
describe the same computation, yet they are different for practical purposes
(because of a different instantiation), and are not even the same computation
if we take a sufficiently long computation to describe what is actually
going on (so the computations take instantiation into account in their
description).
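The unary/decimal contrast above can be made concrete; the helper functions are illustrative, not from the thread:

```python
# Sketch of the point above: unary tallies ("IIIIIIIII") and decimal digits
# ("9") instantiate the same abstract addition, but the concrete
# computations differ (string concatenation vs. positional arithmetic).
def add_unary(a, b):
    # Concatenate tally marks: IIIIIIIII + II -> IIIIIIIIIII
    return a + b

def add_decimal(a, b):
    return str(int(a) + int(b))

unary = add_unary("I" * 9, "I" * 2)
decimal = add_decimal("9", "2")
# Same result under the encoding map, reached by different concrete steps:
print(len(unary), decimal)  # 11 11
```

Whether these count as "the same computation" depends on whether one identifies computations up to encoding, which is exactly the level question being raised.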
Comp just bets that there is a level below which any functionally
correct substitution will preserve my consciousness. It might be that
such a level does not exist, in which case I am an actually infinite
being, and comp is false. That is possible, but out of the scope of my
Bruno, this is exactly my argument against step 8; it fails exactly
at the infinite case. COMP is omega inconsistent.
Bruno Marchal wrote:
It is not a big deal; it just means that my ability to emulate Einstein
(cf Hofstadter) does not make me into Einstein. It only makes me able
to converse with Einstein.
Apart from the question of whether brains can be emulated at all (due to
possible entanglement with their own emulation - I think I will write a
post about this later), that is still not necessarily the case.
It is only the case if you know how to make sense of the emulation. I
don't see that we can assume that this takes less than being Einstein.
No doubt that's true for the first person sense, even with comp. You
might clarify your point a bit more.
I am interested in benjayk's answer too.
You received this message because you are subscribed to the Google Groups
"Everything List" group.