On Tuesday, September 18, 2012 6:08:46 AM UTC-4, rclough wrote:
>
> Hi Craig Weinberg   
>
> IMHO consciousness is not really anything in itself, 
> it is what the brain makes of its contents that the self 
> perceives. 


It gets tricky; it depends on what you mean by a thing. I would say that 
consciousness is the less-than-anything and the more-than-anything which 
experiences the opposite of itself as somethings. It is otherthanthing. In 
order to think or talk about it at all, we have to represent it as a 
subjective idea, a 'thing'.

Make no mistake, though: the brain is itself nothing but an experience of 
many things, of our mind's experience of our body using our body's 
experience of medical instruments. The capacity to experience is primary. 
No structure can generate an experience unless it is made out of something 
which already has that capacity. If I make a perfect model of H2O out of 
anything other than actual hydrogen and oxygen atoms, I will not get water.
 

> The self is intelligence, which is   
> able to focus all pertinent brain activity to a unified point. 
>

You don't need intelligence to have a self. Infants are pretty selfish, and 
not terribly intelligent. Brain activity is overrated as well: jellyfish 
and worms have no brains, and neither do bacteria, yet bacteria behave 
intelligently (see also quorum sensing). Intelligence is everywhere - just 
not human intelligence.

Craig
 

>
> Roger Clough, rcl...@verizon.net 
> 9/18/2012   
> "Forever is a long time, especially near the end." 
> Woody Allen 
>
> ----- Original Message -----   
> From: Craig Weinberg   
> To: everything-list   
> Time: 2012-09-17, 23:43:08 
> Subject: Re: Zombieopolis Thought Experiment 
>
>
>
>
> On Monday, September 17, 2012 11:02:16 PM UTC-4, stathisp wrote: 
> On Tue, Sep 18, 2012 at 6:39 AM, Craig Weinberg  wrote:   
>
> > I understand that, but it still assumes that there is such a thing as a 
> > set of functions which could be identified and reproduced that cause 
> > consciousness. I don't assume that, because consciousness isn't like 
> > anything else. It is the source of all functions and appearances, not 
> > the effect of them. Once you have consciousness in the universe, then 
> > it can be enhanced and altered in infinite ways, but none of them can 
> > replace the experience that is your own. 
>
> No, the paper does *not* assume that there is a set of functions that 
> if reproduced will cause consciousness. It assumes that something 
> like what you are saying is right. 
>
>
> By "assume" I mean the implicit assumptions which are left unstated in 
> the paper. The thought experiment comes out of a paradox arising from 
> assumptions about qualia and the brain which are both false in my view. I 
> see the brain as the flattened qualia of human experience. 
>   
>
>
> >>> > This is the point of the thought experiment. The limitations of all 
> >>> > forms of measurement and perception preclude all possibility of 
> >>> > there ever being such a thing as an exhaustively complete set of 
> >>> > third person behaviors of any system. 
> >>> > 
> >>> > What is it that you don't think I understand? 
> >>> 
> >>> What you don't understand is that an exhaustively complete set of 
> >>> behaviours is not required. 
> >> 
> >> Yes, it is. Not for prosthetic enhancements or repairs to a nervous 
> >> system, but to replace a nervous system without replacing the person 
> >> who is using it, there is no set of behaviors which can ever be 
> >> exhaustive enough in theory to accomplish that. You might be able to 
> >> do it biologically, but there is no reason to trust it unless and 
> >> until someone can be walked off of their brain for a few weeks or 
> >> months and then walked back on. 
> >> 
> >> The replacement components need only be within the engineering 
> >> tolerance of the nervous system components. This is a difficult task 
> >> but it is achievable in principle. 
> >   
> >   
> > You assume that consciousness can be replaced, but I understand exactly 
> > why it can't. You can believe that there is no difference between 
> > scooping out your brain stem and replacing it with a functional 
> > equivalent as long as it was well engineered, but to me it's a 
> > completely misguided notion. Consciousness doesn't exist on the outside 
> > of us. Engineering only deals with exteriors. If the universe were 
> > designed by engineers, there could be no consciousness. 
>
> Yes, that is exactly what the paper assumes. Exactly that!   
>
>
> It is still modeling the experience of qualia as having a quantitative 
> relation to the ratio of brain to non-brain. That isn't the only way to 
> model it, and I use a different model. 
>
>
> >> I assume that my friends have not been replaced by robots. If they 
> >> have been, then that means the robots can almost perfectly replicate 
> >> their behaviour, since I (and people in general) am very good at 
> >> picking up even tiny deviations from normal behaviour. The question 
> >> then is: if the function of a human can be replicated this closely by 
> >> a machine, does that mean the consciousness can also be replicated? 
> >> The answer is yes, since otherwise we would have the possibility of a 
> >> person having radically different experiences but behaving normally 
> >> and being unaware that their experiences were different. 
> > 
> > The answer is no. A cartoon of Bugs Bunny has no experiences but 
> > behaves just like Bugs Bunny would if he had experiences. You are 
> > eating the menu. 
>
> And if it were possible to replicate the behaviour without the   
> experiences - i.e. make a zombie - it would be possible to make a   
> partial zombie, which lacks some experiences but behaves normally and   
> doesn't realise that it lacks those experiences. Do you agree that   
> this is the implication? If not, where is the flaw in the reasoning?   
>
>
> The word zombie implies that you have an expectation of consciousness but 
> there isn't any. That is a fallacy from the start, since there is no 
> reason to expect a simulation to have any experience at all. It's not a 
> zombie, it's a puppet. 
>
> A partial zombie is just someone who has brain damage, and yes, if you 
> tried to replace enough of a person's brain with non-biological material, 
> you would get brain damage, dementia, coma, and death. 
>
> Craig 
>   
>
>
>
> --   
> Stathis Papaioannou   
>
>
