On Monday, September 17, 2012 11:02:16 PM UTC-4, stathisp wrote:
>
> On Tue, Sep 18, 2012 at 6:39 AM, Craig Weinberg <whats...@gmail.com>
> wrote:
>
> > I understand that, but it still assumes that there is such a thing as
> > a set of functions which could be identified and reproduced that cause
> > consciousness. I don't assume that, because consciousness isn't like
> > anything else. It is the source of all functions and appearances, not
> > the effect of them. Once you have consciousness in the universe, then
> > it can be enhanced and altered in infinite ways, but none of them can
> > replace the experience that is your own.
>
> No, the paper does *not* assume that there is a set of functions that
> if reproduced will cause consciousness. It assumes that something
> like what you are saying is right.
>

By "assume" I mean the implicit assumptions left unstated in the paper.
The thought experiment comes out of a paradox arising from assumptions
about qualia and the brain which are both false in my view. I see the
brain as the flattened qualia of human experience.

>
> >>> > This is the point of the thought experiment. The limitations of
> >>> > all forms of measurement and perception preclude all possibility
> >>> > of there ever being such a thing as an exhaustively complete set
> >>> > of third person behaviors of any system.
> >>> >
> >>> > What is it that you don't think I understand?
> >>>
> >>> What you don't understand is that an exhaustively complete set of
> >>> behaviours is not required.
> >>
> >> Yes, it is. Not for prosthetic enhancements, or repairs to a nervous
> >> system, but to replace a nervous system without replacing the person
> >> who is using it, yes, there is no set of behaviors which can ever be
> >> exhaustive enough in theory to accomplish that. You might be able to
> >> do it biologically, but there is no reason to trust it unless and
> >> until someone can be walked off of their brain for a few weeks or
> >> months and then walked back on.
> >>
> >> The replacement components need only be within the engineering
> >> tolerance of the nervous system components. This is a difficult task
> >> but it is achievable in principle.
> > 
> > 
> > You assume that consciousness can be replaced, but I understand
> > exactly why it can't. You can believe that there is no difference
> > between scooping out your brain stem and replacing it with a
> > functional equivalent as long as it was well engineered, but to me
> > it's a completely misguided notion. Consciousness doesn't exist on the
> > outside of us. Engineering only deals with exteriors. If the universe
> > were designed by engineers, there could be no consciousness.
>
> Yes, that is exactly what the paper assumes. Exactly that! 
>

It still models the experience of qualia as having a quantitative
relation to the ratio of brain to non-brain. That isn't the only way to
model it, and I use a different model.
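
For concreteness, here is a minimal Python sketch contrasting the two
kinds of model at issue. The function names and the 0.5 tolerance are
illustrative assumptions of mine, not anything taken from the paper:

    # Two toy models of how experience might vary with the fraction of a
    # brain replaced by functional substitutes. Names and the tolerance
    # value are illustrative assumptions only.

    def graded_model(fraction_replaced):
        # The paper's implicit picture: qualia scale with the ratio of
        # brain to non-brain, fading smoothly as replacement proceeds.
        return 1.0 - fraction_replaced

    def collapse_model(fraction_replaced, tolerance=0.5):
        # One reading of the alternative: experience holds until
        # replacement exceeds what the living system can tolerate, then
        # fails outright (damage, coma) rather than fading by degrees.
        return 1.0 if fraction_replaced < tolerance else 0.0

    for f in (0.0, 0.25, 0.5, 0.75, 1.0):
        print("%.2f replaced -> graded %.2f, collapse %.0f"
              % (f, graded_model(f), collapse_model(f)))

The point of the contrast is that the same third-person behaviour is
compatible with either curve, which is why I don't accept the first one
as given.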

>
> >> I assume that my friends have not been replaced by robots. If they
> >> have been then that means the robots can almost perfectly replicate
> >> their behaviour, since I (and people in general) am very good at
> >> picking up even tiny deviations from normal behaviour. The question
> >> then is, if the function of a human can be replicated this closely by
> >> a machine does that mean the consciousness can also be replicated?
> >> The answer is yes, since otherwise we would have the possibility of a
> >> person having radically different experiences but behaving normally
> >> and being unaware that their experiences were different.
> > 
> > 
> > The answer is no. A cartoon of Bugs Bunny has no experiences but
> > behaves just like Bugs Bunny would if he had experiences. You are
> > eating the menu.
>
> And if it were possible to replicate the behaviour without the 
> experiences - i.e. make a zombie - it would be possible to make a 
> partial zombie, which lacks some experiences but behaves normally and 
> doesn't realise that it lacks those experiences. Do you agree that 
> this is the implication? If not, where is the flaw in the reasoning? 
>

The word zombie implies that you have an expectation of consciousness but
there isn't any. That is a fallacy from the start, since there is no
reason to expect a simulation to have any experience at all. It's not a
zombie, it's a puppet.

A partial zombie is just someone who has brain damage, and yes, if you
tried to replace enough of a person's brain with non-biological material,
you would get brain damage, dementia, coma, and death.

Craig

>
>
> -- 
> Stathis Papaioannou 
>
