On Sep 16, 2012, at 10:42 PM, Craig Weinberg <whatsons...@gmail.com> wrote:

> Moreover, this 
> set has subsets, and we can limit our discussion to these subsets. For 
> example, if we are interested only in mass, we can simulate a human 
> perfectly using the right number of rocks. Even someone who believes 
> in an immortal soul would agree with this. 
> No, I don't agree with it at all. You are eating the menu. A quantity of mass 
> doesn't simulate anything except in your mind. Mass is a normative 
> abstraction which we apply in comparing physical bodies with each other. To 
> reduce a human being to a physical body is not a simulation; it is only 
> weighing a bag of organic molecules.

I'm just saying that the mass of the human and the mass of the rocks is the 
same, not that the rocks and the human are the same. They share a property, 
which manifests as identical behaviour when they are put on scales. What's 
controversial about that?

> Yes, but there are properties of the brain that may not be relevant to 
> behaviour. Which properties are in fact important is determined by 
> experiment. For example, we may replace the myelin sheath with a 
> synthetic material that has similar electrical properties and then 
> test an isolated nerve to see if action potentials propagate in the 
> same way. If they do, then the next step is to incorporate the nerve 
> in a network and see if the pattern of firing in the network looks 
> normal. The step after that is to replace the myelin in the brain of a 
> rat to see if the animal's behaviour changes. The modified rats are 
> compared to unmodified rats by a blinded researcher to see if he can 
> tell the difference. If no-one can consistently tell the difference 
> then it is announced that the synthetic myelin appears to be a 
> functionally identical substitute for natural myelin.
> Except it isn't identical. No imitation substance is identical to the 
> original. Sooner or later the limits of the imitation will be found - or they 
> could be advantages. Maybe the imitation myelin prevents brain cancer or heat 
> stroke or something, but it also maybe prevents sensation in cold weather or 
> maybe certain amino acids now cause Parkinson's disease. There is no such 
> thing as identical. There is only 'seems identical from this measure at this 
> time'.

Yes, it's not *identical*. No-one has claimed this. And since it's not 
identical, under some possible test it would behave differently; otherwise it 
would be identical. But there are some changes which make no functional 
difference. If I have a drink of water, that changes my brain by decreasing the 
sodium concentration. But this change is not significant if we are considering 
whether I continue to manifest normal human behaviour, since firstly the brain 
is tolerant of moderate physical changes and secondly people can manifest a 
range of different behaviours and remain recognisably human and recognisably 
the same human. In other words humans have certain engineering tolerances in 
their components, and the aim in replacing components would be to do it within 
this tolerance. Perfection is not attainable by either engineers or nature.

> As is the nature 
> of science, another team of researchers may then find some deficit in 
> the behaviour of the modified rats under conditions the first team did 
> not examine. Scientists then make modifications to the formula of the 
> synthetic myelin and do the experiments again. 
> Which is great for medicine (although ultimately maybe unsustainably 
> expensive), but it has nothing to do with the assumption of identical 
> structure and the hard problem of consciousness. There is no such thing as 
> identical experience. I have suggested that in fact we can perhaps define 
> consciousness as that which has never been repeated. It is the antithesis of 
> that which can be repeated (hence the experience of "now"), even though 
> experiences themselves can seem very repetitive. They only seem so from the 
> vantage point of a completely novel moment of consideration of the memories 
> of previous iterations.

Here is where you have misunderstood the whole aim of the thought experiment in 
the paper you have cited. The paper assumes that identical function does *not* 
necessarily result in identical consciousness and follows this idea to see 
where it leads.

> > This is the point of the thought experiment. The limitations of all forms 
> > of 
> > measurement and perception preclude all possibility of there ever being 
> > such a thing as an exhaustively complete set of third person behaviors of any 
> > system. 
> > 
> > What is it that you don't think I understand? 
> What you don't understand is that an exhaustively complete set of 
> behaviours is not required.
> Yes, it is. Not for prosthetic enhancements, or repairs to a nervous system, 
> but to replace a nervous system without replacing the person who is using it, 
> yes, there is no set of behaviors which can ever be exhaustive enough in 
> theory to accomplish that. You might be able to do it biologically, but there 
> is no reason to trust it unless and until someone can be walked off of their 
> brain for a few weeks or months and then walked back on.

The replacement components need only be within the engineering tolerance of the 
nervous system components. This is a difficult task, but it is achievable in 
principle.

> I don't access an exhaustively complete 
> set of behaviours to determine if my friends are the same people from 
> day to day, and in fact they are *not* the same systems from day to 
> day, as they change both physically and psychologically. I have in 
> mind a rather vague set of behavioural limits and if the 
> people who I think are my friends deviate significantly from these 
> limits I will start to worry. 
> Which is exactly why you would not want to replace your friends with devices 
> capable only of programmed deviations. Are simulated friends 'good enough'? 
> Will it be good enough when your friends convince you to be replaced by your 
> simulation?

I assume that my friends have not been replaced by robots. If they have been 
then that means the robots can almost perfectly replicate their behaviour, 
since I, like people in general, am very good at picking up even tiny deviations 
from normal behaviour. The question then is, if the function of a human can be 
replicated this closely by a machine does that mean the consciousness can also 
be replicated? The answer is yes, since otherwise we would have the possibility 
of a person having radically different experiences but behaving normally and 
being unaware that their experiences were different.

-- Stathis Papaioannou

You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.