On Oct 13, 7:04 pm, Stathis Papaioannou <stath...@gmail.com> wrote:
> On Fri, Oct 14, 2011 at 1:50 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
> >> A person who has the visual cortex of his brain replaced with a
> >> functionally equivalent computer will behave as if he can see
> >> normally, claim that he can see normally and believe that he can see
> >> normally. It is therefore not like blindsight, where the patient has
> >> deficient vision and claims that he cannot see at all. It is also not
> >> like Anton's syndrome, the opposite of blindsight, where the patient
> >> is blind due to a cortical lesion but has the delusional belief that
> >> he can see and walks around stumbling into things.
> > It is also not like blindsight or Anton's syndrome in that those
> > things exist, while the idea of a 'functionally equivalent' computer
> > replacement for the visual cortex may well be an unrealizable dream
> > based upon a misguided approach which is almost right but ends up
> > being exactly wrong. My idea of Multisense Realism suggests this idea
> > is a dead end, and that has always been my point. I have no opinion
> > about whether or not a functionally equivalent computer does this or
> > would not do that; my opinion is only that the qualia of vision are not
> > an objective function (although they inform on objective conditions in
> > a way that has functional benefits), but a subjective sense.
> I think a functionally equivalent computerised brain would be very
> difficult to make but that is not the point of the philosophical
> argument. The point is that it should be physically possible provided
> that the third person observable behaviour of the brain is computable.
The physical behavior of the brain is not the same thing as the
biological and neurological behavior. Each level not only becomes more
complex and thus more difficult to compute, but introduces more and
more factors which are truly uncomputable. Subjectivity plays a
greater and greater role as the more complex units become more
empowered decision makers with more options and more time to consider
their own preferences. Physical-level phenomena have less bandwidth and
shallower subjectivity, so decisions are automatic reflexes passing from
input to output and back with no time to interpret them.
> For it to be computable the brain must conform to physical laws and
> those laws must be computable. You have agreed that the brain will
> conform to physical laws - it won't do anything magical.
As soon as a cell becomes alive, we cannot meaningfully describe its
existence in purely physical or chemical terms. It can only be reduced
to the biological level and still be understood as a cell. From the
point of view of physics, chemistry is magic. From chemistry, biology is magic.
> It is possible that some of the physics the brain uses is non-computable, as
> Roger Penrose thinks; however, there isn't really any evidence for this.
What would such evidence look like? Is there any evidence to support
the idea that human feeling is computable?
> If you could make a functionally equivalent artificial brain that
> lacks qualia (and that applies even if the brain is not computerised)
> then you would be able to create a partial zombie. A partial zombie
> *behaves* normally and *believes* he has normal qualia. That means you
> could be a partial zombie right now, since you behave as if you can
> see and believe that you can see. If you think that is absurd (and you
> have said you do) then partial zombies thus defined are impossible,
> which means a brain that was functionally equivalent in its third
> person observable behaviour must also be equivalent in its qualia.
The notion of a functionally equivalent artificial brain is what is
absurd, so all of the bogus hypothetical ideas that follow from it are
also garbage. I understand the thought experiment, but it doesn't hold
water because it assumes functionalism a priori, then uses its own
conclusion to justify functionalism through circular reasoning.
The whole idea of third person observable behavior is also a non
sequitur. What is the third person observable behavior of a Chinese
character? Does the reproduction of a Chinese character produce
equivalent qualia in a person who can read Chinese versus one who
cannot? A brain is the same way. Without a human consciousness using
the brain, it's just a mass of meaningless tissue, or a colony of
microorganisms, or a matrix of sampled electromagnetic coordinates,
etc. It has no meaningful observable behaviors. They only become
meaningful to us when we relate them to our subjective experiences
which we already find meaningful. Without those as a starting point,
there is nothing about the brain which is worth simulating.
I understand that you think I'm not getting the point that you have to
agree to the thought experiment conditions that include comp, but I do
understand that. You don't understand that I see too much of a problem
with this thought experiment to bother with it. Yes, if functionalism could be
true, then function would be all that is required to do anything, and
if function is all that is required to do anything then anything that
has the same function would have to do everything exactly the same.
It's circular. You could say the same thing with anything. If instead
of comp, we decide to do a thought experiment where we decide that
anything that casts the same shadow must be the same thing, then
if we make something with the exact same shadow then it must be the
same thing that we have made. It's a fallacy. I can make a volleyball
and call it a soccer ball when it isn't.
> I've repeated this argument several times and you have responded thus:
> - It would be really difficult to make a functionally equivalent brain
> (yes, I agree, but this is a philosophical argument, not an
> engineering project)
Not just difficult, but potentially impossible, depending on your
definition of equivalent.
> - A brain can't be functionally equivalent without the qualia (yes,
> this is assumed at the beginning because we are only discussing the
> third person observable behaviour)
> - Qualia are not computable (yes, we assume you are right about this
> at the beginning - otherwise it would be begging the question)
> - Partial zombies as redefined by you can exist (maybe, but you don't
> win debates by redefining terms)
> - A simulation of a thing is not the thing (yes, but the assumption is
> that the simulation just controls the firing of the neurons with which
> it interfaces, not that it is the same as the neurons or has qualia)
The simulation is supposed to replace the neurons. That's what it's for.
> - Determinism would leave no room for feeling and decision-making (I
> disagree but let's assume that you are right about this too and see
> where it leads)
> - Computers are cold, heartless things (not said by you in those exact
> words but we also assume this at the beginning)
There are two different things. There is computation, which is not so
much cold and heartless as empty and lifeless, but accurate and
precise. Then there are silicon semiconductors, materials with
properties that suit computation, not because they are empty and
lifeless, but because they have the qualities of glass, metal, and
plastic. Silicon is malleable when you want it to be and rigid when you
don't. It does what it's told without going off on its own in some
kind of organic unpredictability. The limitation of computation can be
seen when we try to execute a computer program using live hamsters as
bits. It doesn't work very well.
> >> We can imagine a
> >> condition of perfect blindsight in combination with Anton's syndrome:
> >> the patient lacks visual qualia while responding normally to visual
> >> cues and has a delusional belief that he has normal vision. The
> >> problem with that is, there is no way to diagnose it: we could all be
> >> suffering from it and we wouldn't know, so it is just as good as
> >> normal vision.
> > So if you have a stroke and find yourself trapped in a body that is
> > going around killing people (sort of an Angel Heart scenario), since
> > there is no way to know the difference between your behavior and you,
> > then you say it's just as good as you. We would have to treat
> > that behavior as if it were criminally intentional, but I don't think
> > that has to have anything at all to do with subjectivity. We have
> > involuntary behaviors that are different from voluntary behaviors.
> > Just because we can't tell which is which from the outside doesn't
> > mean that the subjective distinction on the inside isn't critically
> > important - much more important than outside appearances.
> As I keep repeating, I would have to *behave* normally and *believe*
> that I was normal.
You don't have to keep repeating it, I got it the first time, weeks
ago. I understand the conditions of the thought experiment, I just
think it's foolish to try to entertain it because I already understand
that human behavior supervenes upon personal experience as well as
brain function. They are intertwined so that one cannot be reproduced
without the other.
> Since I don't normally go around killing people, if
> I had a brain lesion that made me do that I would not be behaving
> normally, even if I were deluded in thinking that I was.
Past behavior is not an indication of potential behavior. Killing
isn't normal for anyone, until they start killing things.
You received this message because you are subscribed to the Google Groups
"Everything List" group.