James N Rose wrote:
> Just to throw a point of perspective into this
> conversation about mimicking qualia.
> I posed a thematic question in my 1992 opus
> "Understanding the Integral Universe".
> "What of a single celled animus like an amoeba or paramecium?
> Does it 'feel' itself? Does it sense the subtle variations
> in its shape as it bumps around in its liquid world? Does it
> somehow note changes in water pressure around it? Is it
> always "hungry"? What drives a single celled creature to eat?
> What "need", if any, is fulfilled? Is it due to an internal
> pressure gradient in its chemical metabolism? Is there a
> resilience to its boundary that not only determines its
> particular shape, whether amoebic or firm, but that variations
> in that boundary re-distribute pressures through its form to
> create a range of responsive actions? And, because it is
> coherent for that life form, is "this" primal consciousness?
> How far down into the structure of existence can we reasonably
> extrapolate this? An atom's electron cloud responds and interacts
> with its level of environment, but is this consciousness? We
> cannot personify, and therefore mystify, all kinetic functions
> as different degrees of consciousness; at least not at this point.
> Neither can we specify with any certainty a level where
> consciousness suddenly appears, where there was none before."
> "UIU"(c)ROSE 1992 ; 02)Intro section.
If consciousness is the creation of an inner narrative to be stored in
long-term memory, then there are levels of consciousness. The amoeba forms no
memories and so is not conscious at all. A dog forms memories and even has some
understanding of symbols (gestures, words) and so is conscious. In between
there are various degrees of consciousness corresponding to different
complexity and scope of learning.
> "Pain" is a net-collective qualia, an 'other-tier' cybernetic
> emerged phenomenon. But it is -not unrelated- to phenomena
> like basic EM field changes and 'system's experiences' in those
> precursive tiers.
> Also, "pain" (an aspect of -consciousness-), has to be understood
> in regard to the panorama of 'kinds-of-sentience' that any given
> system/organism has, embodies, utilizes or enacts.
> In other words, it would be wrong to dismiss the presence of
> 'pain' in autonomic nervous systems, simply because the
> cognitive nervous system is 'unaware' of the signals or
> the distress situation generating them.
This seems to depend on whether you define pain to be the conscious experience
of pain, or you allow that the bodily reaction is evidence of pain in some more
general sense. I think Stathis posed the question in terms of conscious
experience. There's really no doubt that one can create an artificial system
that reacts to distress, as in my example of a modern aircraft.