Re: The Irreducibility of Consciousness

2006-08-06 Thread Brent Meeker

John M wrote:
> Brent:
> My idea was exactly what you thundered against. There is no adequate proof, 
> science is a limited model-view, the quote from J. Neumann even more so, and 
> the court-proof is the compromise (called law) between conflicting interests 
> in a society. Reasonable doubt relies on how stupid the contemplators are.
> The 'model' you formulate and examine is based on a limited view of an 
> already established circle of relevance within those explanations people 
> sweated out based on inadequate observational methods, immature conditions, 
> and thought limited by the appropriate era's epistemic cognitive inventory.

That's a complicated sentence and I'm not sure what you mean - but I 
formulated no model.  I said that scientific (and common sense) theories 
*are models*.  They certainly are not confined to an "already established 
circle...etc".  Otherwise all physics would still be Newtonian and there'd 
be no quantum mechanics and relativity, much less string theory and MWI.

> Disregarding the 'rest' (maybe not even knowing about more at that time).
> I am not sitting in the complacent lukewarm water of a limited knowledge 
> base and cutting my thinking accordingly - rather I confess to my ignorance 
> and TRY to come up with better.

So what have you come up with?  Is it not a model, but reality itself?

Brent Meeker


--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list
-~--~~~~--~~--~--~---



Re: The Irreducibility of Consciousness

2006-08-06 Thread John M

Brent:
My idea was exactly what you thundered against. There is no adequate proof, 
science is a limited model-view, the quote from J. Neumann even more so, and 
the court-proof is the compromise (called law) between conflicting interests 
in a society. Reasonable doubt relies on how stupid the contemplators are.
The 'model' you formulate and examine is based on a limited view of an 
already established circle of relevance within those explanations people 
sweated out based on inadequate observational methods, immature conditions, 
and thought limited by the appropriate era's epistemic cognitive inventory.
Disregarding the 'rest' (maybe not even knowing about more at that time).
I am not sitting in the complacent lukewarm water of a limited knowledge 
base and cutting my thinking accordingly - rather I confess to my ignorance 
and TRY to come up with better.
I am not alone in this, not too efficient either.

John M
- Original Message - 
From: "Brent Meeker" <[EMAIL PROTECTED]>
To: 
Sent: Sunday, August 06, 2006 5:15 PM
Subject: Re: The Irreducibility of Consciousness



Stathis Papaioannou wrote:
> John M writes (quoting SP):
>
>
>>St:
>>Are you suggesting that a brain with the same
>>pattern of neurons firing, but without the appropriate environmental
>>stimulus, would not have exactly the same conscious experience?
>>
>>[JM]:
>>Show me, I am an experimentalist. First show two brains with the same
>>pattern of (ALL!) neuron firings. Two extracted identical firings in a
>>superdupercomplex brain are meaningless.
>>Then, please, show me (experimentally) the non-identity of environmental
>>impacts reaching 2 different brains from the unlimited interaction of the
>>totality.
>>(I wrote already that I do not approve of thought-experiments.)
>
>
> Of course, you could not have both brains stimulated in the usual manner in 
> both environments because then they would not have identical patterns of 
> neural firing; you would have to artificially stimulate one of the brains in 
> exactly the right manner to mimic the stimulation it would receive via its 
> sense organs. That would be very difficult to achieve in a practical 
> experiment, but the question is, *if* you could do this would you expect 
> that the brains would be able to guess on the basis of their subjective 
> experience alone which one was which?
>
> Actually, "natural" experiments something like this occur in people going 
> through a psychotic episode. Most people who experience auditory 
> hallucinations find it impossible to distinguish between the hallucination 
> and the real thing: the voices sound *exactly* as it sounds when someone is 
> talking to them, which is why (if they are that sort of person) they might 
> assault a stranger on the train in the belief that they have insulted or 
> threatened them, when the poor fellow has said nothing at all. I think this 
> example alone is enough to show that it is possible to have a perception 
> with cortical activity alone; you don't even need to artificially stimulate 
> the auditory nerve.
>
>
>>St:
>>That would imply some sort of extra-sensory perception, and there is
>>no evidence for such a thing. It is perfectly consistent with all the 
>>facts
>>to say that consciousness results from patterns of neurons firing in the
>>brain, and that if the same neurons fired, the same experience would
>>result regardless of what actually caused those neurons to fire.
>>
>>[JM]:
>>regardless also of the 'rest of the brain'? Would you pick one of the
>>billions completing the brainwork complexity and match it to a similar one
>>in a different complexity?
>>But the more relevant question (and I mean it):
>>What would you identify as (your version of) "consciousness" that "results
>>from neuron-firing" consistent with all the facts?
>
>
> My neurons fire and I am conscious; if they didn't fire I wouldn't be 
> conscious, and if they fired very differently to the way they are doing I 
> would be differently conscious. That much, I think, is obvious. Maybe there 
> is something *in addition* to the physical activity of our neurons which 
> underpins consciousness, but at the moment it appears that the neurons are 
> both necessary and sufficient, so you would have to present some convincing 
> evidence (experimental is always best, as you say, but theoretical will do) 
> if you want to claim otherwise.
>
>
>>St:
>>As for consciousness being fundamentally irreducible, I agree
>>completely.
>>
>>[JM]:
>>Conside

Re: The Irreducibility of Consciousness

2006-08-06 Thread Brent Meeker

Stathis Papaioannou wrote:
> John M writes (quoting SP):
> 
> 
>>St:
>>Are you suggesting that a brain with the same
>>pattern of neurons firing, but without the appropriate environmental
>>stimulus, would not have exactly the same conscious experience?
>>
>>[JM]:
>>Show me, I am an experimentalist. First show two brains with the same 
>>pattern of (ALL!) neuron firings. Two extracted identical firings in a 
>>superdupercomplex brain are meaningless.
>>Then, please, show me (experimentally) the non-identity of environmental 
>>impacts reaching 2 different brains from the unlimited interaction of the 
>>totality.
>>(I wrote already that I do not approve of thought-experiments.)
> 
> 
> Of course, you could not have both brains stimulated in the usual manner in 
> both environments because then they would not have identical patterns of 
> neural firing; you would have to artificially stimulate one of the brains in 
> exactly the right manner to mimic the stimulation it would receive via its 
> sense organs. That would be very difficult to achieve in a practical 
> experiment, but the question is, *if* you could do this would you expect 
> that the brains would be able to guess on the basis of their subjective 
> experience alone which one was which?
> 
> Actually, "natural" experiments something like this occur in people going 
> through a psychotic episode. Most people who experience auditory 
> hallucinations find it impossible to distinguish between the hallucination 
> and the real thing: the voices sound *exactly* as it sounds when someone is 
> talking to them, which is why (if they are that sort of person) they might 
> assault a stranger on the train in the belief that they have insulted or 
> threatened them, when the poor fellow has said nothing at all. I think this 
> example alone is enough to show that it is possible to have a perception 
> with cortical activity alone; you don't even need to artificially stimulate 
> the auditory nerve.
> 
> 
>>St:
>>That would imply some sort of extra-sensory perception, and there is
>>no evidence for such a thing. It is perfectly consistent with all the facts
>>to say that consciousness results from patterns of neurons firing in the
>>brain, and that if the same neurons fired, the same experience would
>>result regardless of what actually caused those neurons to fire.
>>
>>[JM]:
>>regardless also of the 'rest of the brain'? Would you pick one of the 
>>billions completing the brainwork complexity and match it to a similar one 
>>in a different complexity?
>>But the more relevant question (and I mean it):
>>What would you identify as (your version of) "consciousness" that "results 
>>from neuron-firing" consistent with all the facts?
> 
> 
> My neurons fire and I am conscious; if they didn't fire I wouldn't be 
> conscious, and if they fired very differently to the way they are doing I 
> would be differently conscious. That much, I think, is obvious. Maybe there 
> is something *in addition* to the physical activity of our neurons which 
> underpins consciousness, but at the moment it appears that the neurons are 
> both necessary and sufficient, so you would have to present some convincing 
> evidence (experimental is always best, as you say, but theoretical will do) 
> if you want to claim otherwise.
> 
> 
>>St:
>>As for consciousness being fundamentally irreducible, I agree
>>completely.
>>
>>[JM]:
>>Consider it a singularity, a Ding an Sich? Your statement looks to me as 
>>referring to a "thing". Not a process. Or rather a state? (Awareness??)
>>*
>>St:
>>It is a fact that when neurons fire in a particular way, a conscious 
>>experience results; possibly, complex enough electronic activity in a 
>>digital computer might also result in conscious experience, although we 
>>cannot be sure of that. But this does not mean that the conscious experience 
>>*is* the brain or computer activity, even if it could somehow be shown that 
>>the physical process is necessary and sufficient for the experience.
>>
>>[JM]:
>>I hope you could share with us your version of that "conscious experience" 
>>as well, which "could" be assigned to a digital computer? What "other" 
>>activity may a digital computer have besides "electronic"?
>>It is hard to show in 'parallel' observed phenomena whether one is 
>>'necessary' for the other, or just observable in parallel? Maybe "the 
>>other" is necessary for the 'one'?
>>If you find that the 'physical' process (firing, or electronic) is 
>>SUFFICIENT then probably your definition is such that it allows such 
>>sufficiency.
>>I may question the complexity of the assigned situation for such 
>>simplification.
> 
> 
> I don't know that computers can be conscious, and I don't even know that 
> computers can emulate human-type intelligent behaviour. Proving the latter 
> lies in the domain of experimental science, while proving the former is 
> impossible,  
> although it is also impossible

RE: The Irreducibility of Consciousness

2006-08-05 Thread Stathis Papaioannou

John M writes (quoting SP):

> St:
> Are you suggesting that a brain with the same
> pattern of neurons firing, but without the appropriate environmental
> stimulus, would not have exactly the same conscious experience?
> 
> [JM]:
> Show me, I am an experimentalist. First show two brains with the same 
> pattern of (ALL!) neuron firings. Two extracted identical firings in a 
> superdupercomplex brain are meaningless.
> Then, please, show me (experimentally) the non-identity of environmental 
> impacts reaching 2 different brains from the unlimited interaction of the 
> totality.
> (I wrote already that I do not approve of thought-experiments.)

Of course, you could not have both brains stimulated in the usual manner in 
both environments because then they would not have identical patterns of 
neural firing; you would have to artificially stimulate one of the brains in 
exactly the right manner to mimic the stimulation it would receive via its 
sense organs. That would be very difficult to achieve in a practical 
experiment, but the question is, *if* you could do this would you expect 
that the brains would be able to guess on the basis of their subjective 
experience alone which one was which?

Actually, "natural" experiments something like this occur in people going 
through a psychotic episode. Most people who experience auditory 
hallucinations find it impossible to distinguish between the hallucination 
and the real thing: the voices sound *exactly* as it sounds when someone is 
talking to them, which is why (if they are that sort of person) they might 
assault a stranger on the train in the belief that they have insulted or 
threatened them, when the poor fellow has said nothing at all. I think this 
example alone is enough to show that it is possible to have a perception 
with cortical activity alone; you don't even need to artificially stimulate 
the auditory nerve.

> St:
> That would imply some sort of extra-sensory perception, and there is
> no evidence for such a thing. It is perfectly consistent with all the facts
> to say that consciousness results from patterns of neurons firing in the
> brain, and that if the same neurons fired, the same experience would
> result regardless of what actually caused those neurons to fire.
> 
> [JM]:
> regardless also of the 'rest of the brain'? Would you pick one of the 
> billions completing the brainwork complexity and match it to a similar one 
> in a different complexity?
> But the more relevant question (and I mean it):
> What would you identify as (your version of) "consciousness" that "results 
> from neuron-firing" consistent with all the facts?

My neurons fire and I am conscious; if they didn't fire I wouldn't be 
conscious, and if they fired very differently to the way they are doing I 
would be differently conscious. That much, I think, is obvious. Maybe there 
is something *in addition* to the physical activity of our neurons which 
underpins consciousness, but at the moment it appears that the neurons are 
both necessary and sufficient, so you would have to present some convincing 
evidence (experimental is always best, as you say, but theoretical will do) 
if you want to claim otherwise.

> St:
> As for consciousness being fundamentally irreducible, I agree
> completely.
> 
> [JM]:
> Consider it a singularity, a Ding an Sich? Your statement looks to me as 
> referring to a "thing". Not a process. Or rather a state? (Awareness??)
> *
> St:
> It is a fact that when neurons fire in a particular way, a conscious 
> experience results; possibly, complex enough electronic activity in a 
> digital computer might also result in conscious experience, although we 
> cannot be sure of that. But this does not mean that the conscious experience 
> *is* the brain or computer activity, even if it could somehow be shown that 
> the physical process is necessary and sufficient for the experience.
> 
> [JM]:
> I hope you could share with us your version of that "conscious experience" 
> as well, which "could" be assigned to a digital computer? What "other" 
> activity may a digital computer have besides "electronic"?
> It is hard to show in 'parallel' observed phenomena whether one is 
> 'necessary' for the other, or just observable in parallel? Maybe "the 
> other" is necessary for the 'one'?
> If you find that the 'physical' process (firing, or electronic) is 
> SUFFICIENT then probably your definition is such that it allows such 
> sufficiency.
> I may question the complexity of the assigned situation for such 
> simplification.

I don't know that computers can be conscious, and I don't even know that 
computers can emulate human-type intelligent behaviour. Proving the latter 
lies in the domain of experimental science, while proving the former is 
impossible, although it is also impossible to *prove* that another person is 
conscious. 

> St:
> Consciousness is something entirely different and, if you like, mysterious, 
> in a category of

Re: The Irreducibility of Consciousness

2006-08-04 Thread John M

Dear Stathis,
you touched a 'conscious' nerve in me.
Let me concentrate on your text and interleave my remarks and questions.
John M
- Original Message - 
From: "Stathis Papaioannou" <[EMAIL PROTECTED]>
To: "Tom Caylor" 
Sent: Thursday, August 03, 2006 9:37 PM
Subject: RE: The Irreducibility of Consciousness




Tom Caylor writes:

> I totally agree that consciousness requires "outside" interaction.
> That's the way we are.  We are living beings that exist in a world.
> We, as we are, couldn't exist otherwise.  Things happen.  We interact.
> We make other things happen.  The question of consciousness is a
> contradiction.  The question is trying to reduce consciousness to
> something less than it is.  Even Bruno's number world leads him to
> believe in the irreducibility of consciousness.  It is a mystery.  We
> need to get off of our modern reductionistic thrones or we will die
> before we live
--
Stathis's text:

It's one thing to say that consciousness has evolved to interact with
the world, would not develop in a particular individual without the
appropriate interaction, and wouldn't be of much use without the
possibility of such interaction, but quite another thing to say that
therefore it *must* be this way. We can only deduce that there is a
physical world out there on the basis of patterns of neurons firing in
our cerebral cortex.

[JM]:
"to interact" looks to me as a purpose, what I don't find in the natural 
processes, only consequences (for change). And does your "on the basis" mean 
origination - source? or at the most a parallel phenomenon to who knows 
what? I know you meant it, but you ose us to point out the 'reasons' for 
those firings. In this way somebody could think on an 'inside' generated 
image of "reality" one thinks about.
*
St:
Are you suggesting that a brain with the same
pattern of neurons firing, but without the appropriate environmental
stimulus, would not have exactly the same conscious experience?

[JM]:
Show me, I am an experimentalist. First show two brains with the same 
pattern of (ALL!) neuron firings. Two extracted identical firings in a 
superdupercomplex brain are meaningless.
Then, please, show me (experimentally) the non-identity of environmental 
impacts reaching 2 different brains from the unlimited interaction of the 
totality.
(I wrote already that I do not approve of thought-experiments.)
You felt like that in the following sentence:
*
St:
That would imply some sort of extra-sensory perception, and there is
no evidence for such a thing. It is perfectly consistent with all the facts
to say that consciousness results from patterns of neurons firing in the
brain, and that if the same neurons fired, the same experience would
result regardless of what actually caused those neurons to fire.

[JM]:
regardless also of the 'rest of the brain'? Would you pick one of the 
billions completing the brainwork complexity and match it to a similar one 
in a different complexity?
But the more relevant question (and I mean it):
What would you identify as (your version of) "consciousness" that "results 
from neuron-firing" consistent with all the facts?
*
St:
As for consciousness being fundamentally irreducible, I agree
completely.

[JM]:
Consider it a singularity, a Ding an Sich? Your statement looks to me as 
referring to a "thing". Not a process. Or rather a state? (Awareness??)
*
St:
It is a fact that when neurons fire in a particular way, a conscious 
experience results; possibly, complex enough electronic activity in a 
digital computer might also result in conscious experience, although we 
cannot be sure of that. But this does not mean that the conscious experience 
*is* the brain or computer activity, even if it could somehow be shown that 
the physical process is necessary and sufficient for the experience.

[JM]:
I hope you could share with us your version of that "conscious experience" 
as well, which "could" be assigned to a digital computer? What "other" 
activity may a digital computer have besides "electronic"?
It is hard to show in 'parallel' observed phenomena whether one is 
'necessary' for the other, or just observable in parallel? Maybe "the 
other" is necessary for the 'one'?
If you find that the 'physical' process (firing, or electronic) is 
SUFFICIENT then probably your definition is such that it allows such 
sufficiency.
I may question the complexity of the assigned situation for such 
simplification.
*
St:
Consciousness is something entirely different and, if you like, mysterious, 
in a category of its own.

[JM]:
Now you are talking! Thanks

Stathis Papaioannou

[J

RE: The Irreducibility of Consciousness

2006-08-03 Thread Stathis Papaioannou


Tom Caylor writes:

> I totally agree that consciousness requires "outside" interaction.
> That's the way we are.  We are living beings that exist in a world.
> We, as we are, couldn't exist otherwise.  Things happen.  We interact.
> We make other things happen.  The question of consciousness is a
> contradiction.  The question is trying to reduce consciousness to
> something less than it is.  Even Bruno's number world leads him to
> believe in the irreducibility of consciousness.  It is a mystery.  We
> need to get off of our modern reductionistic thrones or we will die
> before we live.

It's one thing to say that consciousness has evolved to interact with 
the world, would not develop in a particular individual without the 
appropriate interaction, and wouldn't be of much use without the 
possibility of such interaction, but quite another thing to say that 
therefore it *must* be this way. We can only deduce that there is a 
physical world out there on the basis of patterns of neurons firing in 
our cerebral cortex. Are you suggesting that a brain with the same 
pattern of neurons firing, but without the appropriate environmental 
stimulus, would not have exactly the same conscious experience? 
That would imply some sort of extra-sensory perception, and there is 
no evidence for such a thing. It is perfectly consistent with all the facts 
to say that consciousness results from patterns of neurons firing in the 
brain, and that if the same neurons fired, the same experience would 
result regardless of what actually caused those neurons to fire.

As for consciousness being fundamentally irreducible, I agree 
completely. It is a fact that when neurons fire in a particular way, 
a conscious experience results; possibly, complex enough electronic 
activity in a digital computer might also result in conscious experience, 
although we cannot be sure of that. But this does not mean that the 
conscious experience *is* the brain or computer activity, even if it 
could somehow be shown that the physical process is necessary and 
sufficient for the experience. Consciousness is something entirely 
different and, if you like, mysterious, in a category of its own.

Stathis Papaioannou



The Irreducibility of Consciousness

2006-08-03 Thread Tom Caylor

I totally agree that consciousness requires "outside" interaction.
That's the way we are.  We are living beings that exist in a world.
We, as we are, couldn't exist otherwise.  Things happen.  We interact.
We make other things happen.  The question of consciousness is a
contradiction.  The question is trying to reduce consciousness to
something less than it is.  Even Bruno's number world leads him to
believe in the irreducibility of consciousness.  It is a mystery.  We
need to get off of our modern reductionistic thrones or we will die
before we live.

Tom

Stathis Papaioannou wrote:
> Brent Meeker writes:
>
> > > The brain-with-wires-attached cannot interact with the environment, 
> > > because all its sense organs have been removed and the stimulation is 
> > > just coming from a recording. Instead of the wires + recording we could 
> > > say that there is a special group of neurons with spontaneous activity 
> > > that stimulates the rest of the brain just as if it were receiving input 
> > > from the environment. Such a brain would have no ability to interact 
> > > with the environment, unless the effort were made to figure out its 
> > > internal code and then manufacture sense organs for it - but I think 
> > > that would be stretching the definition of "potential interaction". In 
> > > any case, I don't see how "potential interaction" could make a 
> > > difference.
> >
> > Yet you had to refer to "stimulate...as if it were receiving input from the
> > environment" to create an example.  If there were no potential interaction
> > there could be no "as if".  So istm that the potential interaction can be an
> > essential part of the definition.  That's not to say that such a definition
> > is right - definitions aren't right or wrong - but it's a definition that
> > makes a useful distinction that comports with our common sense.
>
> It's very difficult to define "potential interaction". With even a 
> completely solipsistic computer we could imagine taking readings at various 
> points in the circuit with an oscilloscope and/or changing circuit 
> voltages, capacitance, resistance etc. Is the fact that we *could* do this 
> enough to make the computer conscious? Or would it only be conscious if we 
> had access to its design specifications, so that we could in principle 
> communicate with it meaningfully rather than just making random changes? 
> What if the human race died out but the computer continued to function, 
> with no hope that anyone might ever talk to it? What if the computer had 
> very complex (putatively) conscious thoughts, but rather simple input and 
> output, e.g. it beeps when the count from a connected geiger counter 
> matches the number it happens to be thinking of at the time: would that be 
> enough to make it conscious, or does the environmental interaction have to 
> match or reflect (or potentially so) the complexity of its internal 
> thoughts?
>
> > > If you had two brains sitting in the dark, identical in anatomy and 
> > > electrical activity except that one has its optic nerves cut, will one 
> > > brain be conscious and the other not?
> >
> > Where did the brains come from?  Since they had optic nerves can we suppose
> > that they had the potential to see photons and they still have this
> > potential given replacement optic nerves?  Not necessarily.  Suppose one
> > came from a cat that was raised in complete darkness.  We know
> > experimentally that this cat can't see...even when there is light.  The lack
> > of stimulus results in the brain not forming the necessary structures for
> > interpreting signals from the retina.  Now suppose it were raised with no
> > stimulus whatever, even in utero.  I conjecture that it would not "think" at
> > all - although there would be "computation", i.e. neurons firing in some
> > order.  But it would no longer have the potential for interaction; even with
> > its own body.
>
> Yes, the cat would be missing essential brain structures so it would not be 
> conscious of light even if you somehow gave it eyes and optic nerves. But I 
> think this makes the point that perception/consciousness does not occur in 
> the environment but in the brain. If you have the right environmental 
> inputs but the wrong brain, there is no perception, whereas if you have the 
> right brain with the neurons firing in the right way, but in