Stathis Papaioannou wrote:
> On 2/20/07, *Mark Peaty* <[EMAIL PROTECTED]> wrote:
>     Stathis:'Would any device that can create a representation of the
>     world, itself and the relationship between the world and itself be
>     conscious?'
>     MP: Well that, in a nutshell, is how I understand it; with the
>     proviso that it is dynamic: that all representations of all salient
>     features and relationships are being updated sufficiently often to
>     deal with all salient changes in the environment and self. In the
>     natural world this occurs because all the creatures in the past
>     who/which failed significantly in this respect got eaten by
>     something that stalked its way in between the updates, or the
>     creature in effect did not pay enough attention to its environment
>     and in consequence lost out somehow in ever contributing to the
>     continuation of its species' gene pool.
>     Stathis [in another response to me in this thread]: 'You can't prove
>     that a machine will be conscious in the same way you are.'
>     MP: Well, that depends what you mean;
>        1. to what extent does it matter what I can prove anyway?   
>        2. exactly what, or rather what range of, sufficiently complex
>           systems are you referring to as 'machines'?;
>        3. what do you mean by 'conscious in the same way you are'?;
>     I'm sure others can think of equally or more interesting questions
>     than these, but I can respond to these.
>        1. I am sure I couldn't prove whether or not a machine was
>           conscious, but if the 'machine' was, and it was smart enough
>           and interested enough, IT could, by engaging us in conversation
>           about its experiences, what it felt like to be what/who it is,
>           and questioning us about what it is like to be us.
>           Furthermore, as Colin Hales has pointed out, if the machine
>           was doing real science it would be pretty much conclusive that
>           it was conscious.
>        2. By the word machine I could refer to many of the biological
>           entities that are significantly less complex than humans. What
>           ever one says in this respect, someone somewhere is going to
>           disagree, but I think maybe insects and the like could quite
>           reasonably be classed as sentient machines with near
>           Zombie status.
>        3. If we accept a rough and ready type of physicalism, though
>           naturalism is maybe the word I am looking for here, then it is
>           pretty much axiomatic that the consciousness of a
>           creature/machine will differ from mine in the same degree that
>           its body, instinctive behaviour, and environmental niche
>           differ from mine. I think this must be true of all sentient
>           entities. Some of the people I know are 'colour blind'; about
>           half the people I know are female; many of the people I know
>           exhibit quite substantial differences in temperament and
>           predispositions. I take it that these differences from me are
>           real and entail various real differences in the quality of
>           what it is like to be them [or rather their brain's updating
>           of the model of them in their worlds].
>         I am interested in birds [and here is meant the feathered
>         variety] and often speculate about why they are doing what they
>         do and what it may be like to be them. They have very small
>         heads compared to mine so their brains can update their models
>         of self in the world very much faster than mine can. This must
>         mean that their perceptions of time and changes are very
>         different. To them I must be a very slow and stupid seeming
>         terrestrial giant. Also many birds can see by means of ultra
>         violet light. This means that many things such as flowers and
>         other birds will look very different compared to what I see.
>         [Aside: I am psyching myself up slowly to start creating a
>         flight simulator program that flies birds rather than aircraft.
>         One of the challenges - by no means the hardest, though - will be
>         to represent UV reflectance in a meaningful way.]
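[A minimal sketch of one possible approach to the UV-representation challenge mentioned in the aside above. This is purely illustrative and not from the thread: it treats each surface as having four reflectance channels (UV, R, G, B) and compresses them into a displayable RGB triple so the UV channel gets its own visible hue. The specific channel assignments are an arbitrary assumption; any injective mapping would serve.]

```python
def falsecolour(uv, r, g, b):
    """Map (UV, R, G, B) reflectances in [0, 1] to a displayable RGB triple.

    The UV channel is shown on its own as blue/violet, while the original
    red+green and green+blue pairs are averaged into the red and green
    display channels. One arbitrary choice among many possible mappings.
    """
    out_r = min(1.0, 0.5 * (r + g))   # red and green share the red channel
    out_g = min(1.0, 0.5 * (g + b))   # green and blue share the green channel
    out_b = min(1.0, uv)              # UV gets the blue channel to itself
    return (out_r, out_g, out_b)
```

With this mapping a purely UV-reflective flower renders as pure blue, so UV patterns invisible to us become visibly distinct in the simulator.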
> 1. If it behaved as if it were conscious *and* it did this using the 
> same sort of hardware as I am using (i.e. a human brain) then I would 
> agree that almost certainly it is conscious. If the hardware were on a 
> different substrate but a direct analogue of a human brain and the 
> result was a functionally equivalent machine then I would be almost as 
> confident, but if the configuration were completely different I would 
> not be confident that it was conscious and I would bet that at least it 
> was differently conscious. As for scientific research, I never managed 
> to understand why Colin thought this was more than just a version of the 
> Turing test.
> 2. I don't consider biological machines to be fundamentally different to 
> other machines.
> 3. Sure, different entities with (at least) functionally different 
> brains will be differently conscious. But I like to use "conscious in 
> the way I am" in order to avoid having to explain or define 
> consciousness in general, or my consciousness in particular. I can 
> meaningfully talk about "seeing red" to a blind person who has no idea 
> what the experience is like: What wavelengths of light lead me to see 
> red? Can I still see red if my eyes are closed or my optic nerve 
> severed? What if I have a stroke in the visual cortex? What if certain 
> parts of my cortex are electrically stimulated? That is, I can go a very 
> long way with the definition "that experience which I have when a red 
> coloured object enters my visual field".
>     Stathis [from the other posting again]: 'There is good reason to
>     believe that the third person observable behaviour of the brain can
>     be emulated, because the brain is just chemical reactions and
>     chemistry is a well-understood field.'
>     MP: Once again it depends what you mean. Does 'Third person
>     observable behaviour of the brain' include EEG recordings and the
>     output of MRI imaging? Or do you mean just the movements of muscles
>     which are the main indicator of brain activity? If the former then I
>     think that would be very hard, perhaps impossible; if the latter
>     however, that just might be achievable. 
> Huh? I think it would be a relatively trivial matter to emulate MRI and 
> EEG data, certainly compared to emulating behaviour as evidenced by 
> muscle activity (complex, intelligent behaviour such as doing science or 
> writing novels is after all just muscle activity, which is just chemical 
> reactions in the muscles triggered by chemical reactions in the brain).
>     Stathis: 'I think it is very unlikely that something as elaborate as
>     consciousness could have developed with no evolutionary purpose
>     (evolution cannot distinguish between me and my zombie twin if
>     zombies are possible), but it is a logical possibility.'
>     MP: I agree with the first bit, but I do not agree with the last
>     bit. If you adopt what I call UMSITW [the Updating Model of Self In
>     The World], then anything which impinges on consciousness, has a
>     real effect on the brain. In effect the only feasible zombie like
>     persons you will meet will either be sleep walking or otherwise
>     deficient as a consequence of drug use or brain trauma. I think
>     Oliver Sacks's book The Man Who Mistook His Wife For a Hat gives
>     many examples illustrating the point that all deficiencies in
>     consciousness correlate strictly with lesions in the sufferer's brain.
> A human with an intact brain behaving like an awake human could not 
> really be a zombie unless you believe in magic. However, it is possible 
> to conceive of intelligently-behaving beings who do not have an internal 
> life because they lack the right sort of brains. I am not suggesting 
> that this is the case and there are reasons to think it is unlikely to 
> be the case, but it is not ruled out by any empirical observation.
> Stathis Papaioannou

The problem is that there doesn't seem to be any conceivable observation that 
could rule it out.  So by Popper's rule it is not a scientific proposition 
but rather a metaphysical one.  This is another way of saying that there is no 
agreed upon way of assigning a truth or probability value to it.

Brent Meeker
