On 11 Aug 2011, at 19:24, meekerdb wrote:

On 8/11/2011 7:14 AM, Stathis Papaioannou wrote:

In any case, I have made the thought experiment simpler by *assuming*
that the replacement component is mechanically equivalent to the
biological tissue. We can imagine that it is a black box animated by
God, who makes it tickle the surrounding neural tissue in exactly the
right way. I think installation of such a device would *necessarily*
preserve consciousness. What do you think?

Are you assuming that there is no "consciousness" in the black box?

The problem is there. In fine, if the brain works like a machine, consciousness is not related to its physical activity, which is a sub-level notion, but to the mathematical relations defining the high-level computation leading to the subject's computational state.

A related problem: is the black box supposed to be counterfactually correct, or is it only accidentally correct for one particular execution? Answering that question while keeping the mechanist assumption leads to the "in fine" just above.
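The counterfactual distinction can be sketched in a toy program (all names here are hypothetical stand-ins for the neural interface, not anything from the thread): a counterfactually correct box computes the right response for any stimulus, while a replay box merely reproduces one recorded execution and is blind to inputs that differ from it.

```python
# Toy sketch of the counterfactual distinction (all names hypothetical).

def counterfactual_box(stimulus):
    """Computes the correct response for every possible stimulus."""
    return stimulus * 2  # stand-in for the real input/output function

class ReplayBox:
    """Replays one recorded run; correct only if the inputs match that run."""
    def __init__(self, recorded_run):
        self.recorded_run = iter(recorded_run)

    def respond(self, stimulus):
        recorded_stimulus, recorded_response = next(self.recorded_run)
        # No check against `stimulus`: the box is blind to counterfactuals.
        return recorded_response

# One particular execution: stimuli 1, 2, 3.
trace = [(s, counterfactual_box(s)) for s in [1, 2, 3]]

# On the recorded run the two boxes are indistinguishable...
replay = ReplayBox(trace)
assert [replay.respond(s) for s in [1, 2, 3]] == [2, 4, 6]

# ...but on a different run the replay box answers wrongly:
replay = ReplayBox(trace)
print(replay.respond(5))  # prints 2, the recorded answer for stimulus 1
```

On the single run that was recorded, the two boxes tickle the surrounding tissue identically; they differ only in how they would have behaved, which is exactly where the question about the substitution's status lies.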

I think we are close to the question: does comp entail the 323 principle?

I don't insist on that, but it is a key to obtaining the *necessary* ontological reduction to arithmetic (which does not entail an epistemological reduction).

I agree with what you say in another post: the term "behavior" is ambiguous. The notion of substitution level disambiguates a part of it, and the notion of counterfactuality disambiguates an orthogonal part of it.


