On 3/17/2010 3:34 AM, Stathis Papaioannou wrote:
On 17 March 2010 05:29, Brent Meeker<meeke...@dslextreme.com>  wrote:

I think this is a dubious argument based on our lack of understanding of
qualia.  Presumably one has many thoughts that do not result in any overt
action.  So if I lost a few neurons (which I do continuously) it might mean
that there are some thoughts I don't have or some associations I don't make,
so eventually I may "fade" to the level of consciousness of my dog.  Is my
dog a "partial zombie"?
It's certainly possible that qualia can fade without the subject
noticing, either because the change is slow and gradual or because the
change fortuitously causes a cognitive deficit as well. But this is not
what the fading qualia argument is about. The argument requires
consideration of a brain change which would cause an unequivocal
change in consciousness, such as a removal of the subject's occipital
lobes. If this happened, the subject would go completely blind: he
would be unable to describe anything placed in front of his eyes, and
he would report that he could not see anything at all. That's what it
means to go blind. But now consider the case where the occipital lobes
are replaced with a black box that reproduces the I/O behaviour of the
occipital lobes, but which is postulated to lack visual qualia. The
rest of the subject's brain is intact and is forced to behave exactly
as it would if the change had not been made, since it is receiving
normal inputs from the black box. So the subject will correctly
describe anything placed in front of him, and he will report that
everything looks perfectly normal. More than that, he will have an
appropriate emotional response to what he sees, be able to paint it or
write poetry about it, make a working model of it from an image he
retains in his mind: whatever he would normally do if he saw
something. And yet, he would be a partial zombie: he would behave
exactly as if he had normal visual qualia while completely lacking
visual qualia. Now it is part of the definition of a full zombie that
it doesn't understand that it is blind, since a requirement for
zombiehood is that it doesn't understand anything at all; it just
behaves as if it does. But if the idea of qualia is meaningful at all,
you would think that a sudden drastic change like going blind should
produce some realisation in a cognitively intact subject; otherwise
how do we know that we aren't blind now, and what reason would we have
to prefer normal vision to zombie vision? The conclusion is that it
isn't possible to make a device that replicates brain function but
lacks qualia: either it is not possible to make such a device at all
because the brain is not computable, or if such a device could be made
(even a magical one) then it would necessarily reproduce the qualia as well.

I generally agree with the above. Maybe I misunderstood the question, but I was considering the possibility of a continuum of lesser qualia AND correspondingly lesser behavior.

However, I think there is something in the above that creates the "just a recording" problem. It's the hypothesis that the black box reproduces the I/O behavior. This implies that the black box realizes a function, not a recording. But then the argument slips over to replacing the black box with a recording which just happens to produce the same I/O, and we're led to the absurd conclusion that a recording is conscious. But which step of the argument should we reject? The most plausible candidate is the difference in response to counterfactuals between the functional box and the recording. That would seem like magic - a different response depending on all the things that don't happen - except that in the MWI of QM all those counterfactuals are available to make a difference.
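To make the function-versus-recording distinction concrete, here is a minimal toy sketch in Python (the class names, the stimulus strings, and the "processed(...)" stand-in are my own illustrative choices, not anything claimed in the thread): a functional box computes its output from whatever input arrives, while a recording just replays a fixed output stream, so the two only come apart on counterfactual inputs that were not part of the recorded run.

# Toy sketch of the function-vs-recording distinction (illustrative only).

class FunctionalBox:
    """Computes its output from whatever input arrives, so it handles counterfactuals."""
    def respond(self, stimulus):
        # Stand-in for whatever processing the real occipital lobes do.
        return f"processed({stimulus})"

class RecordingBox:
    """Replays a fixed output trace, ignoring the actual input."""
    def __init__(self, trace):
        self._trace = iter(trace)
    def respond(self, stimulus):
        return next(self._trace)

functional = FunctionalBox()

# Record the functional box's outputs on one particular input history.
history = ["red ball", "green cube"]
trace = [functional.respond(s) for s in history]

# On that same history the recording is I/O-identical to the function.
replay = RecordingBox(trace)
assert all(replay.respond(s) == functional.respond(s) for s in history)

# On a counterfactual input the two diverge: the recording just emits
# whatever happened to come next in the stored trace.
print(functional.respond("blue pyramid"))           # processed(blue pyramid)
print(RecordingBox(trace).respond("blue pyramid"))  # processed(red ball)

The counterfactual point then becomes: the two boxes agree on everything that actually happens, and differ only in what they would have done.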


I think the question of whether there could be a philosophical zombie is ill-posed because we don't know what is responsible for qualia.  I speculate that they are tags of importance or value that get attached to perceptions so that they are stored in short-term memory.  Then, because evolution cannot redesign things, the same tags are used for internal thoughts that seem important enough to put in memory.  If this is the case, then it might be possible to design a robot which used a different method of evaluating experience for storage, and it would not have qualia like humans - but would it have some other kind of qualia?  Since we don't know what qualia are in a third-person sense, there seems to be no way to answer that.
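Read purely as an information-processing architecture, that speculation amounts to: each perception gets a value tag, and only sufficiently tagged items are written to a small short-term store. A toy sketch of that gating scheme (the threshold, the scores, and the function names are all invented for illustration; nothing here is a claim about what qualia actually are):

# Toy sketch of "importance tags gating short-term memory" (illustrative only).
from collections import deque

STORE_THRESHOLD = 0.5                 # assumed cutoff for "important enough to remember"
short_term_memory = deque(maxlen=7)   # small, bounded store

def importance(percept):
    """Assign a value/importance tag to a perception; here just a hard-coded lookup."""
    scores = {"loud noise": 0.9, "familiar wallpaper": 0.1, "own name spoken": 0.8}
    return scores.get(percept, 0.2)

def perceive(percept):
    tag = importance(percept)
    if tag >= STORE_THRESHOLD:        # the tag, not the percept itself, decides storage
        short_term_memory.append((percept, tag))
    return tag

for p in ["familiar wallpaper", "loud noise", "own name spoken"]:
    perceive(p)

print(list(short_term_memory))        # [('loud noise', 0.9), ('own name spoken', 0.8)]

The robot question then corresponds to swapping in a different importance() function while keeping the same gating structure - and the sketch offers no way to say whether the result would count as qualia, which is exactly the difficulty.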
