John M writes (quoting SP):

> St:
> Are you suggesting that a brain with the same
> pattern of neurons firing, but without the appropriate environmental
> stimulus, would not have exactly the same conscious experience?
> 
> [JM]:
> Show me; I am an experimentalist. First show two brains with the same
> pattern of (ALL!) neuron firings. Two extracted identical firings in a
> superdupercomplex brain are meaningless.
> Then, please, show me (experimentally) the non-identity of environmental
> impacts reaching two different brains from the unlimited interaction of the
> totality.
> (I have already written that I do not approve of thought-experiments.)

Of course, you could not have both brains stimulated in the usual manner in 
both environments, because then they would not have identical patterns of 
neural firing; you would have to artificially stimulate one of the brains in 
exactly the right manner to mimic the stimulation it would receive via its 
sense organs. That would be very difficult to achieve in a practical 
experiment, but the question is, *if* you could do this, would you expect 
that the brains would be able to guess, on the basis of their subjective 
experience alone, which one was which?

Actually, "natural" experiments something like this occur in people going 
through a 
psychotic episode. Most people who experience auditory hallucinations find it 
impossible to distinguish between the hallucination and the real thing: the 
voices 
sound *exactly* as it sounds when someone is talking to them, which is why (if 
they are that sort of person) they might assault a stranger on the train in the 
belief 
that they have insulted or threatened them, when the poor fellow has said 
nothing 
at all. I think this example alone is enough to show that it is possible to 
have a 
perception with cortical activity alone; you don't even need to artificially 
stimulate 
the auditory nerve.

> St:
> That would imply some sort of extra-sensory perception, and there is
> no evidence for such a thing. It is perfectly consistent with all the facts
> to say that consciousness results from patterns of neurons firing in the
> brain, and that if the same neurons fired, the same experience would
> result regardless of what actually caused those neurons to fire.
> 
> [JM]:
> regardless also of the 'rest of the brain'? Would you pick one of the
> billions completing the brainwork complexity and match it to a similar one
> in a different complexity?
> But the more relevant question (and I mean it):
> What would you identify as (your version of) "consciousness" that "results
> from neuron-firing" consistent with all the facts?

My neurons fire and I am conscious; if they didn't fire I wouldn't be 
conscious, and if they fired very differently from the way they are firing 
now, I would be differently conscious. That much, I think, is obvious. Maybe 
there is something *in addition* to the physical activity of our neurons 
which underpins consciousness, but at the moment it appears that the neurons 
are both necessary and sufficient, so you would have to present some 
convincing evidence (experimental is always best, as you say, but theoretical 
will do) if you want to claim otherwise.

> St:
> As for consciousness being fundamentally irreducible, I agree
> completely.
> 
> [JM]:
> Consider it a singularity, a Ding an Sich? Your statement looks to me as if
> it refers to a "thing", not a process. Or rather a state? (Awareness??)
> *
> St:
> It is a fact that when neurons fire in a particular way, a conscious 
> experience results; possibly, complex enough electronic activity in a 
> digital computer might also result in conscious experience, although we 
> cannot be sure of that. But this does not mean that the conscious experience 
> *is* the brain or computer activity, even if it could somehow be shown that 
> the physical process is necessary and sufficient for the experience.
> 
> [JM]:
> I hope you could share with us your version of that "conscious experience" 
> as well, which "could" be assigned to a digital computer? What "other" 
> activity may a digital computer have
> beside "electronic"?
> It is hard to show in 'parallel' observed phenopmena whether  one is 
> 'necessary' for the other, or just observervable in parallel? Maybe "the 
> other" is necessary for the 'one'?
> If you find that the 'physical' process (firing, or electronic) is 
> SUFFICIENT then probably your definition is such that it allows such 
> sufficiency.
> I may question the complexity of the assigned situation
> for such simplification,.

I don't know that computers can be conscious, and I don't even know that 
computers can emulate human-type intelligent behaviour. Proving the latter 
lies in the domain of experimental science, while proving the former is 
impossible, although it is also impossible to *prove* that another person 
is conscious.

> St:
> Consciousness is something entirely different and, if you like, mysterious, 
> in a category of its own.
> 
> [JM]:
> Now you are talking! Thanks

But I might add that saying consciousness is mysterious and in a category of 
its own in no way suggests that computers cannot be conscious.

Stathis Papaioannou

_________________________________________________________________
Be one of the first to try Windows Live Mail.
http://ideas.live.com/programpage.aspx?versionId=5d21c51a-b161-4314-9b0e-4911fb2b2e6d
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list
-~----------~----~----~----~------~----~------~--~---

Reply via email to