On 8/12/2011 2:00 AM, Bruno Marchal wrote:
On 11 Aug 2011, at 19:24, meekerdb wrote:
On 8/11/2011 7:14 AM, Stathis Papaioannou wrote:
In any case, I have made the thought experiment simpler by *assuming*
that the replacement component is mechanically equivalent to the
biological tissue. We can imagine that it is a black box animated by
God, who makes it tickle the surrounding neural tissue in exactly the
right way. I think installation of such a device would *necessarily*
preserve consciousness. What do you think?
Are you assuming that there is no "consciousness" in the black box?
The problem is there. In fine, if the brain works like a machine,
consciousness is not related to its physical activity (which is a
sub-level notion) but to the mathematical relations defining the
high-level computation leading to the subject's computational state.
A related problem: is the black box supposed to be counterfactually
correct, or is it only accidentally correct for one particular
execution? Answering that question while keeping the mechanist
assumption leads to the "in fine" just above.
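For illustration only (the names and the input/output law are hypothetical, not anything from the thread): a minimal Python sketch of the distinction Bruno is drawing, between a component that is counterfactually correct (it would produce the right output for *any* input) and one that merely replays the outputs of one recorded execution:

```python
# A counterfactually correct component: computes the correct output
# for any stimulus it might receive (the law "*2" is a stand-in).
def counterfactual_neuron(stimulus):
    return stimulus * 2

# An "accidentally correct" component: a tape that replays the
# outputs recorded during one particular past execution, ignoring
# whatever stimulus actually arrives.
class ReplayNeuron:
    def __init__(self, recorded_outputs):
        self.tape = iter(recorded_outputs)

    def fire(self, stimulus):
        # The stimulus is ignored; only the tape is consulted.
        return next(self.tape)

# On the one execution that was recorded, the two are indistinguishable:
actual_inputs = [1, 3, 5]
tape = [counterfactual_neuron(s) for s in actual_inputs]
replay = ReplayNeuron(tape)
assert all(replay.fire(s) == counterfactual_neuron(s) for s in actual_inputs)

# But on a counterfactual input stream, the replay diverges:
replay2 = ReplayNeuron(tape[:])
assert replay2.fire(4) != counterfactual_neuron(4)  # tape says 2, law says 8
```

The two components have identical "behavior" on the actual run, which is why the notion of behavior is ambiguous without specifying how counterfactual inputs are handled.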
I think we are close to the question: does comp entail the 323 principle?
Right. If you idealize the brain as a digital computer then it seems
that register 323 is unnecessary. But the brain, like everything else,
is a quantum object, and it is characteristic of QM that possible
interactions that don't occur make a difference. Of course you may
object that QM can be computed by a (classical) digital computer - but
that's only true in an Everettian interpretation. The digital computer
can't compute which interactions occur and which don't; that's
probabilistic. All it can do is compute the probabilities for all the
possible outcomes, *including* the 323 ones.
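A toy sketch of Brent's point (the amplitudes and outcome labels are invented for illustration): a classical program can compute the Born probabilities of every possible outcome, including the never-exercised "323" path, but which single outcome actually occurs is not computed, only sampled:

```python
import random

# Arbitrary illustrative amplitudes for three possible outcomes,
# one of which corresponds to the never-exercised "323" path.
amplitudes = {"outcome_A": 0.8, "outcome_B": 0.59, "outcome_323": 0.1}

# Born rule: probability proportional to |amplitude|^2 (normalized).
norm = sum(a ** 2 for a in amplitudes.values())
probs = {k: a ** 2 / norm for k, a in amplitudes.items()}

# This much is classically computable, and every possible outcome,
# however unlikely, contributes to the distribution:
assert abs(sum(probs.values()) - 1.0) < 1e-12
assert probs["outcome_323"] > 0

# Which outcome *occurs* is not computed; at best it is sampled,
# a stand-in here for irreducible quantum chance.
occurred = random.choices(list(probs), weights=list(probs.values()))[0]
```

In an Everettian reading, the distribution itself is the whole story, which is why the objection only goes through on that interpretation.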
I don't insist on that, but it is a key to getting the *necessary*
ontological reduction to arithmetic (which does not entail an
epistemological reduction).
I agree with what you say in another post: the term "behavior" is
ambiguous. The notion of substitution level disambiguates a part of
it, and the notion of counterfactuality disambiguates an orthogonal
part of it.
Am I right in thinking that the counterfactuality includes *everything*
that didn't happen? I'm not sure that's a coherent concept.
Brent
Bruno
http://iridia.ulb.ac.be/~marchal/
--
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en.