On Thu, Sep 20, 2012 at 6:27 AM, Craig Weinberg <whatsons...@gmail.com> wrote:

>> Chalmers' position is that functionalism is true, and he states this
>> in the introduction, but this is not *assumed* in the thought
>> experiment. The thought experiment explicitly assumes that
>> functionalism is *false*; that consciousness is dependent on the
>> substrate and swapping a brain for a functional equivalent will not
>> necessarily give rise to the same consciousness or any consciousness
>> at all. Isn't that what you believe?
> I believe that there is ontologically no such thing as a functional
> equivalent of an organism by an inorganic mechanism. If you use stem cells
> as the functional equivalent, then it could work fine. There is no 'good
> enough' as a criteria for being alive.

I've explained what "functional equivalent" means in this case. A
replacement neurological component is functionally equivalent to the
biological one if it results in a similar sequence of neural firings
ultimately causing muscle contraction. Note the important word
"similar". If I make a functional equivalent of you it means it will
behave similarly enough to you that people who know you can't tell a
difference. It does not necessarily mean that the functional
equivalent will display exactly the same behaviour that you will, and
in fact it would be extremely unlikely that even an atom for atom copy
would display exactly the same behaviour.

> I am saying that consciousness is not a mechanical function, so it makes no
> difference if you have a trillion little puppet strings pushing dendrites
> around, there is still nothing there that experiences anything as a whole.

Yes, yes, yes that is EXACTLY the assumption made in the thought
experiment. Sorry for shouting. How else can I say it so that it's clear?

>> In order for this to happen
>> according to the paper you have to accept that the physics of the
>> brain is in fact computable. If it is computable, then we can model
>> the behaviour of the brain,
> Except that we can't, because the behavior of the brain is contingent on the
> real experience of the person who is using that brain to experience their
> life. You would have to model the entire cosmos and separate out the
> experiences of a single person to model the brain.

No you wouldn't. All you have to model is when a neuron will fire
given a certain input. If you can do this then you can model when the
motor neurons fire and you can model behaviour. You then have a zombie
or a puppet (in your view and in Chalmers' thought experiment) which
moves around like a person or an animal but lacks any experience. For
example, when you stick a pin in it, it says "ouch!" and threatens to
hit you if you do it again, but it doesn't actually feel any pain, it
doesn't actually understand what has happened, and it doesn't actually
care about your actions or anything else. Is that clear enough?
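To make the point concrete, the kind of model I mean can be sketched in a
few lines: a toy leaky integrate-and-fire neuron that fires whenever its
input-driven potential crosses a threshold. The parameter values and the
function itself are purely illustrative assumptions, not claims about any
real neuron, but they show that "when will this neuron fire, given this
input" is an entirely mechanical question:

```python
def simulate_neuron(inputs, threshold=1.0, leak=0.9):
    """Return a list of booleans: whether the toy neuron fires at each step.

    Illustrative parameters only; 'leak' models charge decaying between
    steps, 'threshold' is the firing point.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate input, leak charge
        if potential >= threshold:
            spikes.append(True)   # neuron fires
            potential = 0.0       # reset after firing
        else:
            spikes.append(False)
    return spikes

# Three moderate inputs build up to a spike; a single large input
# causes one immediately.
print(simulate_neuron([0.5, 0.5, 0.5, 0.0, 1.2]))
# → [False, False, True, False, True]
```

Chain enough of these together, ending in the ones that drive motor
output, and you have modelled behaviour without ever asking whether
anything is experienced.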

>> although according to the assumptions in
>> the paper (which coincide with your assumptions)
> No, they don't. I say that the paper's explicit assumptions are based on
> incorrect implicit assumptions (as are yours) that consciousness is the end
> product of brain mechanisms. I see consciousness as the beginning and ending
> of all things, and the brain as a representation of certain kinds of
> experiences.

And the assumption in the paper is also that consciousness is not
involved in the process whereby electronic circuits cause muscles to contract.

>> modeling the
>> behaviour won't reproduce the consciousness. All the evidence we have
>> suggests that physics is computable, but it might not be. It may turn
>> out that there is some exotic physics in the brain which requires
>> solving the halting problem, for example, in order to model it, and
>> that would mean that a computer could not adequately simulate those
>> components of the brain which utilise this physics. But going beyond
>> the paper, the argument for functionalism (substrate-independence of
>> consciousness) could still be made by considering theoretical
>> components with non-biological hypercomputers.
> Will functionalism make arsenic edible? Will it use numbers to cook food?

If functionalism is true then it will allow you to replace your brain
with a machine and remain you.

> My point is this. I am programmed, but I am not a program. An electronic
> computer is also programmed but not a program. It doesn't matter what kind
> of program is installed on either one, neither of us can become the other.

But you become the person you are in a year's time, even though almost
all the atoms in your body have changed.

Stathis Papaioannou

You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.