On 19 Jan 2014, at 20:35, Jason Resch wrote:




On Sun, Jan 19, 2014 at 11:38 AM, Bruno Marchal <marc...@ulb.ac.be> wrote:

On 19 Jan 2014, at 03:39, LizR wrote:

It would seem that "sufficiently advanced technology" will eventually be able to detect all the neural correlates of consciousness.

Betting on some theory. Betting on some substitution level. Beware the charlatan.



Maybe a p-zombie should be defined as something that has the neural correlates of consciousness but is still somehow not conscious.

Yes. Good idea.



Or that there ain't no such animal.

We can logically conceive of them. Imagine a corpse. You can easily conceive that it is not conscious. Now animate the corpse so that it behaves as if it were alive, but keep conceiving of it as unconscious, a bit like an actor in a movie, except that it interacts "relevantly" with you.

There is no flagrant contradiction. And that is all you need to conceive of them logically, without choosing any particular theory.

Now, in some theories, that can become contradictory, or have only an infinitesimal plausibility.

You can conceive of zombies, just as you can conceive of Santa Claus.
There is no need to believe in them, nor even to find them plausible, in order to conceive of them.


My problem with this, though, is that if a zombie is physically indistinguishable, then all the same information content exists in the zombie brain as in the non-zombie brain. So is it not correct to say the zombie knows something when it possesses all the right information in its brain? If it knows it sees colors, knows it has beliefs, and knows it is conscious, but is somehow not conscious, that seems to me like a contradiction.

I would say that this argument is a good argument for accepting the comp link between knowledge and consciousness. I agree with your point, as an argument that with comp (or weaker forms of functionalism) the zombie notion is close to nonsense. But again, a priori, comp might be false, and a non-comp believer might conceive of a knowing machine (in the sense of Theaetetus) that is unconscious. He would see that the machine believes p, and that p is true, but still believe that the machine has no consciousness related to that knowledge. It makes no sense with comp, or weaker theories, but it makes sense in any theory of mind which posits an external factor as being needed for consciousness (like matter for a materialist à la Peter Jones, a small primitive reality for someone stopping at step 7 in the UDA, or a fairy-tale notion of God, like the belief that everyone might be a zombie until they are baptized, or something). With logic alone, zombies are conceivable. With logic + arithmetic, it is already harder; with logic + arithmetic + comp (or perhaps just rationalism), it becomes harder still. I agree.
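(A minimal sketch of the Theaetetus definition assumed above, with K for knowledge, B for belief, and p an arbitrary proposition; the symbols are only illustrative shorthand:

    K p  :=  B p ∧ p        -- knowledge as true belief

The non-comp believer grants both conjuncts for the machine, hence K p under this definition, while still denying that any consciousness accompanies that knowledge; comp, or weaker functionalism, is what rules that denial out.)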

Bruno






Jason


http://iridia.ulb.ac.be/~marchal/



