On Tue, Jul 26, 2011 at 6:03 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
>> You've completely missed the point again. Perhaps you could try
>> reading Chalmers' paper if you haven't already done so:
>> http://consc.net/papers/qualia.html
>> Unfortunately some people just don't seem to understand it.
> I have read it, and it's a good way of understanding the issue if you
> are going to use the standard models of consciousness, but I have a
> model that I like better. Have you read my executive summary?
> I'm looking at subjectivity as the inverted, involuted topology of
> what we can observe (through our particular private, proprietary
> subjectivity). It is not a process which arises at some level or
> other. It's just that every phenomenon can only identify with what is
> very similar to itself. The sense that it can make of everything else
> is objectified - inside out. So making a brain out of something other
> than brain depends entirely upon how different it is from what it can
> identify with. That may prove to be achievable at a non-biological
> level, but there is no particular reason to imagine that should be the
> case. Human consciousness is as dependent upon human biology as water
> is dependent on H2O. They are the same thing. Silicon microprocessors
> are not the same thing. Programs that run on microprocessors aren't
> the same thing. If they were, you could just write a program that
> simulates ever more powerful processors and greater quantities of
> memory. Let's start with that, because it will be a lot easier. Let's
> write a program that simulates itself running faster than it can run.

The argument in the paper is independent of any particular theory of
consciousness. It just asks the question of whether consciousness can
be separated from externally observable brain function. We could
assume for the sake of argument that consciousness is miraculous:
could God make a neuron that functions normally in its interaction
with other neurons but lacks consciousness? The answer, I think, is
no, for it would lead to absurdity. As far as I can understand your
theory, it would allow for the creation of zombie neurons, therefore
it must be wrong.

Stathis Papaioannou

You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.