On Fri, Oct 14, 2011 at 1:50 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
>> A person who has the visual cortex of his brain replaced with a
>> functionally equivalent computer will behave as if he can see
>> normally, claim that he can see normally and believe that he can see
>> normally. It is therefore not like blindsight, where the patient has
>> deficient vision and claims that he cannot see at all. It is also not
>> like Anton's syndrome, the opposite of blindsight, where the patient
>> is blind due to a cortical lesion but has the delusional belief that
>> he can see and walks around stumbling into things.
> It is also not like blindsight or Anton's syndrome in that those
> things exist, while the idea of a 'functionally equivalent' computer
> replacement for the visual cortex may well be an unrealizable dream
> based upon a misguided approach which is almost right but ends up
> being exactly wrong. My idea of Multisense Realism suggests this idea
> is a dead end, and that has always been my point. I have no opinion
> about whether or not a functionally equivalent computer would do
> this or that; my opinion is only that the qualia of vision are not
> an objective function (although they inform on objective conditions
> in a way that has functional benefits) but a subjective sense.
I think a functionally equivalent computerised brain would be very
difficult to make but that is not the point of the philosophical
argument. The point is that it should be physically possible provided
that the third person observable behaviour of the brain is computable.
For it to be computable the brain must conform to physical laws and
those laws must be computable. You have agreed that the brain will
conform to physical laws - it won't do anything magical. It remains
possible that some of the physics the brain uses is non-computable, as
Roger Penrose thinks; however, there isn't really any evidence for
this.

If you could make a functionally equivalent artificial brain that
lacks qualia (and that applies even if the brain is not computerised)
then you would be able to create a partial zombie. A partial zombie
*behaves* normally and *believes* he has normal qualia. That means you
could be a partial zombie right now, since you behave as if you can
see and believe that you can see. If you think that is absurd (and you
have said you do) then partial zombies thus defined are impossible,
which means a brain that was functionally equivalent in its third
person observable behaviour must also be equivalent in its qualia.
I've repeated this argument several times and you have responded thus:
- It would be really difficult to make a functionally equivalent brain
(yes, I agree, but this is a philosophical argument, not an
engineering one)
- A brain can't be functionally equivalent without the qualia (yes,
this is assumed at the beginning because we are only discussing the
third person observable behaviour)
- Qualia are not computable (yes, we assume you are right about this
at the beginning - otherwise it would be begging the question)
- Partial zombies as redefined by you can exist (maybe, but you don't
win debates by redefining terms)
- A simulation of a thing is not the thing (yes, but the assumption is
that the simulation just controls the firing of the neurons with which
it interfaces, not that it is the same as the neurons or has qualia)
- Determinism would leave no room for feeling and decision-making (I
disagree but let's assume that you are right about this too and see
where it leads)
- Computers are cold, heartless things (not said by you in those exact
words but we also assume this at the beginning)
>> We can imagine a
>> condition of perfect blindsight in combination with Anton's syndrome:
>> the patient lacks visual qualia while responding normally to visual
>> cues and has a delusional belief that he has normal vision. The
>> problem with that is, there is no way to diagnose it: we could all be
>> suffering from it and we wouldn't know, so it is just as good as
>> normal vision.
> So if you have a stroke and find yourself trapped in a body that is
> going around killing people (sort of an Angel Heart scenario), since
> there is no way to know the difference between your behavior and
> you from the outside, then you say it's just as good as you. We
> would have to treat
> that behavior as if it were criminally intentional, but I don't think
> that has to have anything at all to do with subjectivity. We have
> involuntary behaviors that are different from voluntary behaviors.
> Just because we can't tell which is which from the outside doesn't
> mean that the subjective distinction on the inside isn't critically
> important - much more important than outside appearances.
As I keep repeating, I would have to *behave* normally and *believe*
that I was normal. Since I don't normally go around killing people, if
I had a brain lesion that made me do that I would not be behaving
normally, even if I were deluded in thinking that I was.
You received this message because you are subscribed to the Google Groups
"Everything List" group.