On Tuesday, September 18, 2012 7:14:17 PM UTC-4, stathisp wrote:
>
> On Tue, Sep 18, 2012 at 1:43 PM, Craig Weinberg 
> <whats...@gmail.com> 
> wrote: 
>
> >> No, the paper does *not* assume that there is a set of functions that 
> >> if reproduced will cause consciousness. It assumes that something 
> >> like what you are saying is right. 
> > 
> > 
> > By assume I mean the implicit assumptions which are unstated in the 
> > paper. The thought experiment comes out of a paradox arising from 
> > assumptions about qualia and the brain which are both false in my 
> > view. I see the brain as the flattened qualia of human experience. 
>
> Chalmers' position is that functionalism is true, and he states this 
> in the introduction, but this is not *assumed* in the thought 
> experiment. The thought experiment explicitly assumes that 
> functionalism is *false*; that consciousness is dependent on the 
> substrate and swapping a brain for a functional equivalent will not 
> necessarily give rise to the same consciousness or any consciousness 
> at all. Isn't that what you believe? 
>

I believe that there is, ontologically, no such thing as a functional 
equivalent of an organism made from an inorganic mechanism. If you use stem 
cells as the functional equivalent, then it could work fine. There is no 
'good enough' as a criterion for being alive.
 

>
> >> And if it were possible to replicate the behaviour without the 
> >> experiences - i.e. make a zombie - it would be possible to make a 
> >> partial zombie, which lacks some experiences but behaves normally and 
> >> doesn't realise that it lacks those experiences. Do you agree that 
> >> this is the implication? If not, where is the flaw in the reasoning? 
> > 
> > 
> > The word zombie implies that you have an expectation of consciousness 
> > but there isn't any. That is a fallacy from the start, since there is 
> > no reason to expect a simulation to have any experience at all. It's 
> > not a zombie, it's a puppet. 
>
> Replace the word "zombie" with "puppet" if that makes it easier to 
> understand. 
>

I have no trouble understanding what you are saying.
 

>
> > A partial zombie is just someone who has brain damage, and yes, if you 
> > tried to replace enough of a person's brain with a non-biological 
> > material, you would get brain damage, dementia, coma, and death. 
>
> Not if the puppet components perform the same purely mechanical 
> functions as the original components. 


I am saying that consciousness is not a mechanical function, so it makes no 
difference if you have a trillion little puppet strings pushing dendrites 
around; there is still nothing there that experiences anything as a whole.
 

> In order for this to happen, 
> according to the paper, you have to accept that the physics of the 
> brain is in fact computable. If it is computable, then we can model 
> the behaviour of the brain, 


Except that we can't, because the behavior of the brain is contingent on 
the real experience of the person who is using that brain to experience 
their life. You would have to model the entire cosmos and then separate out 
the experiences of a single person in order to model the brain.
 

> although according to the assumptions in 
> the paper (which coincide with your assumptions) 


No, they don't. I say that the paper's explicit assumptions rest on the 
incorrect implicit assumption (as do yours) that consciousness is the end 
product of brain mechanisms. I see consciousness as the beginning and 
ending of all things, and the brain as a representation of certain kinds of 
experiences.
 

> modeling the 
> behaviour won't reproduce the consciousness. All the evidence we have 
> suggests that physics is computable, but it might not be. It may turn 
> out that there is some exotic physics in the brain which requires 
> solving the halting problem, for example, in order to model it, and 
> that would mean that a computer could not adequately simulate those 
> components of the brain which utilise this physics. But going beyond 
> the paper, the argument for functionalism (substrate-independence of 
> consciousness) could still be made by considering theoretical 
> components with non-biological hypercomputers. 
>

Will functionalism make arsenic edible? Will it use numbers to cook food?

My point is this: I am programmed, but I am not a program. An electronic 
computer is also programmed but is not a program. It doesn't matter what 
kind of program is installed on either one; neither of us can become the 
other. 

Craig 
