On 8/7/2011 4:42 AM, Stathis Papaioannou wrote:
That, as I keep saying, is the question. Assume that the bot can behave like a person but lacks consciousness. Then it would be possible to replace parts of your brain with non-conscious components that otherwise function normally, which would lead to you lacking some important aspect of consciousness but being unaware of it.


Put that way it seems absurd. But what about lacking consciousness while *acting as if you were unaware* of it? The philosophical zombie says he's conscious and has an internal narration and imagines and dreams... but does he? Can we say that he must? If he says he doesn't, can we be sure he's lying? Even though I think functionalism is right, I think consciousness may be very different depending on how the internal functions are implemented. I go back to the example of having an inner narration in language (which most of us didn't have before age 4). I think Julian Jaynes was right to suppose that this was an evolutionary accident arising from co-opting the perceptual mechanism of language. In a sense all thought may be perception; it's just that some of it is perception of internal states.

Brent

This is absurd, but it is a corollary of the claim that it is possible to separate consciousness from function. Therefore, the claim that it is possible to separate consciousness from function is shown to be false. If you don't accept this then you allow what you have already admitted is an absurdity.

