> The argument was more of the type: "removal of unnecessary and
> unconscious or unintelligent parts." Those parts just have no
> perspective. If they had some perspective playing a role in Alice's
> consciousness, it would mean we have not chosen the substitution
> level well. You are reintroducing some consciousness at the level of
> the elementary parts here, I think.

The problem would not be with removing individual elementary parts and
replacing them with functionally equivalent pieces; that obviously
preserves the whole. The problem would rather be with removing whole
subgraphs and replacing them with equivalent pieces. As
Alice-in-the-cave is supposed to show, this can remove consciousness,
at least in the limit where the entire movie is replaced...

> Then you think that someone who is conscious with some brain which,
> for some reason, never uses certain neurons, could lose consciousness
> when those never-used neurons are removed?
> If that were true, how could we still be confident in an artificial
> digital brain? You may be right, but the MEC hypothesis would be put
> in doubt.

I am thinking of it as being the same as someone having knowledge
which they never actually use. Suppose the situation is so extreme
that if we removed the neurons involved in that knowledge, we would
not alter the person's behavior; yet we would have removed the
knowledge. Similarly, if Alice's behavior in practice comes from a
recording, yet a dormant conscious portion is continually ready to
intervene if needed, then removing that dormant portion removes her
consciousness.

