On Oct 14, 11:08 am, Stathis Papaioannou <stath...@gmail.com> wrote:
> On Fri, Oct 14, 2011 at 1:04 PM, Craig Weinberg <whatsons...@gmail.com> wrote:
> >> If you could make a functionally equivalent artificial brain that
> >> lacks qualia (and that applies even if the brain is not computerised)
> >> then you would be able to create a partial zombie. A partial zombie
> >> *behaves* normally and *believes* he has normal qualia. That means you
> >> could be a partial zombie right now, since you behave as if you can
> >> see and believe that you can see. If you think that is absurd (and you
> >> have said you do) then partial zombies thus defined are impossible,
> >> which means a brain that was functionally equivalent in its third
> >> person observable behaviour must also be equivalent in its qualia.
>
> > The notion of a functionally equivalent artificial brain is what is
> > absurd, so that all of the bogus hypothetical ideas that follow from
> > it are also garbage. I understand the thought experiment, but it
> > doesn't hold water because it assumes functionalism a priori, then
> > uses its erroneous conclusion to justify functionalism through
> > circular reasoning.
>
> Functionalism assumes that the qualia will be reproduced if the
> observable function of the brain is reproduced. The thought experiment
> assumes that this is *not* the case. You have therefore missed one of
> the most basic points.

You are ignoring my point. Function is in the eye of the observer. Do
you understand that this is true or do you insist that there is an
absolute reality that is beyond any particular observation?

>
> > The whole idea of third person observable behavior is also a non-
> > sequitur. What is the third person observable behavior of a Chinese
> > character? Does the reproduction of a Chinese character produce
> > equivalent qualia in a person who can read Chinese versus one who
> > cannot? A brain is the same way. Without a human consciousness using
> > the brain, it's just a mass of meaningless tissue, or a colony of
> > microorganisms, or a matrix of sampled electromagnetic coordinates,
> > etc. It has no meaningful observable behaviors. They only become
> > meaningful to us when we relate them to our subjective experiences
> > which we already find meaningful. Without those as a starting point,
> > there is nothing about the brain which is worth simulating.
>
> The third person observable behaviour of a Chinese character is the
> way it reflects the light.

Not true. It could be carved in wood or cast in bronze so that it can
be read by touching it. There is no difference in the way something
reflects the light unless there is something that can tell the
difference.

> It is not how the character is interpreted.
> The third person observable behaviour of a neuron is the timing and
> voltage of the action potential, the type and amount of
> neurotransmitter released at the synapse, and so on.

Those characteristics are only observable using specific instruments to
extend the body into the microcosm. When we use different instruments,
we get different observations. Our subjective experience is just a set
of observations using different instruments.

> It is not what
> qualia are associated with these activities. Are you now clear on what
> third person observable behaviour (which usually is just called
> "behaviour") means?

I am clear that you don't understand what I am talking about.

> > I understand that you think I'm not getting the point that you have to
> > agree to the thought experiment conditions that include comp, but I do
> > understand that. You don't understand that I see too much of a
> > problem with this thought experiment to bother with it. Yes, if
> > functionalism could be
> > true, then function would be all that is required to do anything, and
> > if function is all that is required to do anything then anything that
> > has the same function would have to do everything exactly the same.
> > It's circular. You could say the same thing with anything. If instead
> > of comp, we decide to do a thought experiment where we decide that
> > anything that casts the same shadow must be the same thing, then
> > if we make something with the exact same shadow then it must be the
> > same thing that we have made. It's a fallacy. I can make a volleyball
> > and call it a soccer ball when it isn't.
>
> No, as I have repeatedly said the initial assumption is that comp is
> wrong, functionalism is wrong.

My point, though, is that making that assumption in the first place is
the problem.

>
> >> I've repeated this argument several times and you have responded thus:
> >> - It would be really difficult to make a functionally equivalent brain
> >> (yes, I agree, but this is a philosophical argument, not an
> >> engineering project)
>
> > Not just difficult, but potentially impossible, depending on your
> > definition of equivalent.
>
> The artificial brain part is functionally equivalent if the rest of
> the brain carries on in the usual (third person observable) manner.

It depends entirely on who or what the third person is. You haven't
figured out yet that each observer is capable of observing differently,
so that there is no such thing as a quality that is just observable in
general.

>
> >> - A brain can't be functionally equivalent without the qualia (yes,
> >> this is assumed at the beginning because we are only discussing the
> >> third person observable behaviour)
> >> - Qualia are not computable (yes, we assume you are right about this
> >> at the beginning - otherwise it would be begging the question)
> >> - Partial zombies as redefined by you can exist (maybe, but you don't
> >> win debates by redefining terms)
> >> - A simulation of a thing is not the thing (yes, but the assumption is
> >> that the simulation just controls the firing of the neurons with which
> >> it interfaces, not that it is the same as the neurons or has qualia)
>
> > The simulation is supposed to replace the neurons. That's what it's
> > simulating.
>
> The simulation interfaces with the other neurons so that their pattern
> of firing is the same as it would have been before. The simulation by
> assumption does not reproduce the qualia. It also need not reproduce
> other aspects of the neurons, such as their mass or colour, unless
> this is relevant to their interactions with the other neurons.
>

I'm talking about a digital simulation to replace the whole brain, not
prosthetic additions.

Craig

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
