On 13 Aug 2011, at 15:19, Stathis Papaioannou wrote:
On Fri, Aug 12, 2011 at 7:00 PM, Bruno Marchal <marc...@ulb.ac.be> wrote:
On 11 Aug 2011, at 19:24, meekerdb wrote:
On 8/11/2011 7:14 AM, Stathis Papaioannou wrote:
>>In any case, I have made the thought experiment simpler by specifying
>>that the replacement component is mechanically equivalent to the
>>biological tissue. We can imagine that it is a black box animated by
>>God, who makes it tickle the surrounding neural tissue in exactly the
>>right way. I think installation of such a device would
>>preserve consciousness. What do you think?
> Are you assuming that there is no "consciousness" in the black box?
The problem is there. In fine, if the brain works like a machine,
consciousness is not related to its physical activity (itself a vague
notion), but to the mathematical relations defining the high-level
computation leading to the subject's computational state.
A related problem: is the black box supposed to be counterfactually
correct or not, or is it only accidentally correct for one execution?
Answering that question while keeping the mechanist assumption leads
to the "in fine" point just above.
I think we are close to the question: does comp entail the 323 principle?
I don't insist on that, but it is a key to get the *necessary*
reduction to arithmetic (which does not entail an epistemological
reduction).
I agree with what you say in another post: the term "behavior" is
ambiguous.
The notion of substitution level disambiguates a part of it, and the
notion of counterfactuality disambiguates an orthogonal part of it.
If the black box did not have consciousness if it functioned without
the right counterfactual behaviour (for example if it happened to
provide the right inputs randomly for a period) then that would allow
the creation of partial zombies. What do you think of partial zombies
as a concept?
It does not make sense. Chalmers' argument is valid. But it is only
part of an (older) argument which leads to the complete abandonment of
the physical supervenience thesis.
Suppose a teacher is in front of his classroom answering his students'
questions. Then at time t, his brain stops functioning completely, but
a cosmic explosion, which happened ten years before, sent, by pure
chance, a flux of cosmic rays which supplies the correct inputs to his
muscles (but NOT inside his brain), so that his behavior remains
unchanged for the duration of the lesson. Then he dies.
Was the guy a zombie?
Imagine that the students don't ask any questions. Then the cosmic
rays need only make him look quiet behind his desk. No neurons work at
all, and the cosmic rays supply just enough information to his brain
stem that he does not fall. Is the guy a zombie?
I would say he is. But now, the very fact that I do not think that a
partial zombie is possible makes me abandon the idea that
consciousness is related to the physical activity of the brain. The
consciousness of the guy supervenes on all the computations (in a
continuum of digital computations) as viewed from inside, from a
first-person perspective. It does not supervene on a physical body,
because a physical body does not exist; it is only part of a coherent
mind construction.
In a sense, we, insofar as we see ourselves as bodies, *are* zombies
(total zombies). But this is misleading, because it makes sense only
when we understand that bodies are already creations of the mind, in
the way computer science can explain with the UD (the sigma_1
sentences), and the self-reference logics.
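For readers unfamiliar with the UD: it is a program that generates every program and interleaves ("dovetails") their executions, so that each one is eventually run for arbitrarily many steps. A minimal toy sketch, with stand-in generators instead of a real enumeration of Turing machines (all names here are illustrative, not from any library):

```python
from itertools import count

def program(i):
    """Stand-in for the i-th program: a toy generator that yields
    (program_id, step_number) forever. A real UD would enumerate
    and step all Turing machines here."""
    for step in count():
        yield (i, step)

def dovetail(stages):
    """Universal-dovetailer schedule: at stage n, start program n
    and advance programs 0..n by one step each, so every program
    is eventually executed for arbitrarily many steps."""
    progs = []
    trace = []
    for n in range(stages):
        progs.append(program(n))        # start program n
        for p in progs:
            trace.append(next(p))       # one more step of each
    return trace

# After 4 stages: program 0 has run 4 steps, program 3 has run 1.
trace = dovetail(4)
```

The point of the schedule is fairness: no single non-halting program can starve the others, which is what lets the UD reach every computational state of every program.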
Could you be a partial zombie now; for example, could
you be blind or unable to understand language but just not realise it?
I could suffer an anosognosia which makes me blind and amnesic about
anything related to vision, so that personally I would not notice the
difference. But I would have to infer that there is some kind of
problem from my trouble finding objects and walking without bumping
into the furniture. But in that case I would not say that I am a
partial zombie. I am fully conscious, but handicapped and amnesic. I
don't believe the notion of a partial zombie makes sense in the
absolute.
A total zombie can make sense, in a partial sense different from the
above, like a fake policeman on the road, which behaves like a
policeman in the eyes of the drivers, but presumably has no
consciousness.
Incidentally, I don't understand why philosophers and contributors to
this list are affronted by the idea that a random device or a
recording could sustain consciousness. There seems to be no logical
contradiction or empirical problem with the idea, but people just
don't like it.
With comp, consciousness is associated with a computation, and then
with an infinity of them. Something random can only be a first-person,
geographical or contingent type of experience, like in the iteration
of the WM duplications. So indeed, I think it does not make sense to
attribute either consciousness or even a computation to something
random. The comp idea is that a computation makes sense and, for
animals, reflects some self-referential abilities needed for survival.
If a random device generates by chance a correct computation, you
might or might not attribute consciousness to it, because, in *all*
cases, the consciousness itself is related to infinities of
computations in the tiny sigma_1-complete platonia.
I would not attribute consciousness to the teacher above, because the
rays do not even emulate his brain, just a minimal number of inputs.
If I attribute consciousness to him, then I can attribute Einstein's
or anyone's consciousness to a thermostat, and all the supervenience
theses (the physical one and the comp one) become trivial. You would
not say "yes" to a doctor who proposes to substitute a thermostat for
your brain, after all.
If a magical doctor proposes you a magical brain, which works
completely randomly but by chance fits with all the events in your
life, including the processing of your intimate thoughts, comp might
allow you to say yes, but not because your consciousness will
supervene on the magical brain; rather because, by chance, your body
will follow, at the right level of description, the "correct"
computations which exist atemporally in the sigma_1 arithmetical truth.
You received this message because you are subscribed to the Google Groups
"Everything List" group.