On 8/8/2011 5:34 AM, Stathis Papaioannou wrote:
On Mon, Aug 8, 2011 at 4:35 AM, meekerdb<meeke...@verizon.net>  wrote:
On 8/7/2011 4:42 AM, Stathis Papaioannou wrote:
That, as I keep saying, is the question. Assume that the bot can
behave like a person but lacks consciousness. Then it would be
possible to replace parts of your brain with non-conscious components
that function otherwise normally, which would lead to you lacking some
important aspect of consciousness but being unaware of it.
Put that way it seems absurd.  But what about lacking consciousness but
*acting as if you were unaware* of it?  The philosophical zombie says he's
conscious and has an internal narration and imagines and dreams...but does
he?  Can we say that he must?  If he says he doesn't, can we be sure he's
lying?  Even though I think functionalism is right, I think consciousness
may be very different depending on how the internal functions are
implemented.  I go back to the example of having an inner narration in
language (which most of us didn't have before age 4).  I think Julian Jaynes
was right to suppose that this was an evolutionary accident in co-opting the
perceptual mechanism of language.  In a sense all thought may be perception;
it's just that some of it is perception of internal states.
The trick is to consider not full-blown zombies but partial zombies
based on partial brain replacement. If your visual cortex is replaced
with zombie neurons your visual qualia will disappear but the rest of
the brain will receive normal input, so you will declare that you can
see normally. The possibilities are:

(a) You can in fact see normally. In general, if the behaviour of the
brain is replicated then the consciousness is also replicated.
(b) You are blind but don't realise it, believe you have normal sight
and declare that you have normal sight.
(c) You are blind and realise you are blind but can't do anything
about it, observing helplessly as your vocal cords apparently of their
own accord declare that everything is normal.

I think (a) is the only plausible one of these possibilities.

I think so too. But that doesn't show that some different arrangement of functions in the brain could not produce a qualitatively different kind of consciousness. Indeed, it seems that sociopaths, for example, differ in lacking an empathy "module" in their brains. Some people claim that the ability to understand higher mathematics is "built-in": some people have it and some don't. If we build more and more intelligent, autonomous Mars Rovers, I think we will necessarily instantiate consciousness - but not consciousness like our own. So I'm interested in the question of how we can know how similar it is, and in what ways.


You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.