On Fri, Aug 12, 2011 at 7:00 PM, Bruno Marchal <marc...@ulb.ac.be> wrote:
> On 11 Aug 2011, at 19:24, meekerdb wrote:
> On 8/11/2011 7:14 AM, Stathis Papaioannou wrote:
> In any case, I have made the thought experiment simpler by *assuming*
> that the replacement component is mechanically equivalent to the
> biological tissue. We can imagine that it is a black box animated by
> God, who makes it tickle the surrounding neural tissue in exactly the
> right way. I think installation of such a device would *necessarily*
> preserve consciousness. What do you think?
> Are you assuming that there is no "consciousness" in the black box?
> The problem is there. In fine, if the brain works like a machine,
> consciousness is not related to its physical activity, which is a sub-level
> notion, but to the mathematical relation defining the high-level computation
> leading to the subject's computational state.
> A related problem: is the black box supposed to be counterfactually
> correct, or is it merely accidentally correct for one execution?
> Answering that question while keeping the mechanist assumption leads to the
> "in fine" point just above.
> I think we are close to the question: does comp entail the 323 principle?
> I don't insist on that, but it is a key to getting the *necessary*
> ontological reduction to arithmetic (which does not entail an
> epistemological reduction).
> I agree with what you say in another post: the term "behavior" is ambiguous.
> The notion of substitution level disambiguates a part of it, and the notion
> of counterfactuality disambiguates an orthogonal part of it.
If the black box lacked consciousness when it functioned without the
right counterfactual behaviour (for example, if it happened to
provide the right inputs randomly for a period), then that would allow
the creation of partial zombies. What do you think of partial zombies
as a concept? Could you be a partial zombie now; for example, could
you be blind or unable to understand language but just not realise it?
Incidentally, I don't understand why philosophers and contributors to
this list are affronted by the idea that a random device or a
recording could sustain consciousness. There seems to be no logical
contradiction or empirical problem with the idea, but people just
don't like it.
You received this message because you are subscribed to the Google Groups
"Everything List" group.