On 13.08.2011 14:08 Stathis Papaioannou said the following:
On Sat, Aug 13, 2011 at 9:45 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:
If your visual cortex is replaced by an electronic device that
produces the appropriate outputs at its borders, the rest of
your brain will respond normally.

This is just an assumption. I guess that at present one can neither
prove nor disprove it. Let me quote an opposite assumption from
Jeffrey Gray (p. 232, section 15.5 Consciousness in a brain

How could the rest of your brain possibly respond differently if it
receives exactly the same stimulation? Perhaps you mean that it
would be able to tell that there is an artificial device there due
to electric fields and so on; but in that case the artificial device
is not appropriately reproducing the I/O behaviour of the original.

The question is what "the same stimulation" means. I guess that you mean only electrical signals. However, it may well be that qualia play a role as well.

If I understand you correctly, you presume that conscious experience can be resolved within 'normal science' (that there is no Hard Problem). Jeffrey Gray, on the other hand, acknowledges the Hard Problem and believes that a new scientific theory will be needed to solve it.

"Might it be the case that, if one put a slice of V4 in a dish in
this way, it could continue to sustain colour qualia?
Functionalists have a clear answer to this question: no, because a
slice of V4, disconnected from its normal visual inputs and motor
outputs, cannot discharge the functions associated with the
experience of colour. But, if we had a theory that started, not
from function, but from brain tissue, maybe it would give a
different answer. Alas, no such theory is to hand. Worse, even if one
had been proposed, there is no known way of detecting qualia in a
brain slice!".

It's not clear that an isolated piece of brain tissue would have
normal qualia since it may require the whole brain or at least a
large part of the brain to produce qualia. A neuron in the language
centre won't have an understanding of a small part of the letter

We do not know this now. It was just an idea in the book (among many other ideas). It seems to me, though, that such an idea is on the same level as supposing that a robot will automatically have conscious experience.


You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.