Brent Meeker writes:

> >> I could make a robot that, having suitable thermocouples, would quickly
> >> withdraw its hand from a fire, but not be conscious of it. Even if I
> >> provide the robot with "feelings", i.e. judgements about
> >> good/bad/pain/pleasure, I'm not sure it would be conscious. But if I
> >> provide it with "attention" and memory, so that it noted the painful
> >> event as important and necessary to remember because of its strong
> >> negative affect, then I think it would be conscious.
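
[A minimal sketch of the three layers Brent distinguishes (reflex, affect,
attention + memory), in Python. Everything here is illustrative: the class,
thresholds, and method names are assumptions for the thought experiment,
not any real robotics API.

    class Robot:
        def __init__(self):
            self.memory = []  # episodic store, used only by the third layer

        # Layer 1: reflex -- stimulus in, withdrawal out, nothing recorded.
        def reflex(self, temperature):
            return "withdraw" if temperature > 60 else "stay"

        # Layer 2: "feelings" -- a crude good/bad judgement of the stimulus.
        def affect(self, temperature):
            return -1.0 if temperature > 60 else 0.1

        # Layer 3: attention + memory -- strong negative affect marks the
        # event as important and stores it to shape future behaviour.
        def attend(self, temperature):
            a = self.affect(temperature)
            if abs(a) > 0.5:  # attention threshold (arbitrary)
                self.memory.append(("hot surface", temperature, a))
            return self.reflex(temperature)

    robot = Robot()
    robot.attend(300)    # the reflex fires *and* the event is remembered
    print(robot.memory)  # [('hot surface', 300, -1.0)]

On Brent's account, only the third layer is a candidate for consciousness;
the first two would run exactly as well without it.]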
> >
> > It's interesting that people actually withdraw their hand from the fire
> > *before* they experience the pain. The withdrawal is a reflex, presumably
> > evolved in organisms with the most primitive central nervous systems,
> > while the pain seems to be there as an afterthought, to teach us a lesson
> > so we won't do it again. Thus, from considerations of evolutionary
> > utility, consciousness does indeed seem to be a side-effect of memory and
> > learning.
> 
> Even more curiously, volitional action also occurs before one is aware of
> it. Are you familiar with the experiments of Benjamin Libet and Grey
> Walter?

These experiments showed that in apparently voluntarily initiated motion,
motor cortex activity actually preceded the subject's awareness of his
intention by a substantial fraction of a second. In other words, we act
first, then "decide" to act. These studies did not examine pre-planned
action (presumably that would be far more technically difficult), but it is
easy to imagine the analogous situation, whereby the action is unconsciously
"planned" before we become aware of our decision. In other words, free will
is just a feeling which occurs after the fact. This is consistent with the
logical impossibility of something that is neither random nor determined,
which is what I feel my free will to be.

> > I also think that this is an argument against zombies. If it were
> > possible for an organism to behave just like a conscious being, but
> > actually be unconscious, then why would consciousness have evolved?
> 
> An interesting point - but hard to give any answer before pinning down
> what we mean by consciousness. For example, Bruno, Julian Jaynes, and
> Daniel Dennett have explanations; but they explain somewhat different
> consciousnesses, or at least different aspects.

Consciousness is the hardest thing to explain but the easiest thing to
understand, if it's your own consciousness at issue. I think we can go a
long way discussing it on the assumption that we do know what we are
talking about, even though we can't explain it. The question I ask is: why
did people evolve with this consciousness thing, whatever it is? The answer
must be, I think, that it is a necessary side-effect of the sort of neural
complexity that underpins our behaviour. If it were not, and it were
possible for beings to behave exactly like humans and not be conscious,
then it would have been wasteful of nature to have provided us with
consciousness. This does not necessarily mean that computers can be
conscious: maybe if we had evolved with electronic circuits in our heads
rather than neurons, consciousness would not have been a necessary
side-effect.

Stathis Papaioannou