On Dec 22, 10:26 pm, meekerdb <meeke...@verizon.net> wrote:
> On 12/22/2011 7:00 PM, Craig Weinberg wrote:
> > On Dec 22, 7:13 pm, Jason Resch<jasonre...@gmail.com> wrote:
> >> This is because of the modularity of our brains:
> >> Different sections of the brain perform specific functions. Some neurons
> >> may serve only as communication links between different regions in the
> >> brain, while others may be involved in processing. I think that the
> >> malfunction and correction of a "communication neuron" might not alter
> >> Alice's experience, in the same way we could correct a faulty signal in her
> >> optic nerve and not expect her experience to be affected. I am less sure,
> >> however, that a neuron involved in processing could have its function
> >> replaced by a randomly received particle, as this changes the definition of
> >> the machine.
> >> Think of a register containing a bit '1'. If the bit is '1' because two
> >> inputs were received and the logical AND operation is applied, this is an
> >> entirely different computation from two bits being ANDed, the result placed
> >> in that register, then (regardless of the result) the bit '1' is set in
> >> that register. This erases any effect of the two input bits, and redefines
> >> the computation altogether. This 'set 1' instruction is much like the
> >> received particles from the supernova causing neurons to fire. It is a
> >> very shallow computation, and in my opinion, not likely to lead to any
> >> consciousness.
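(To make the register example above concrete, here is a minimal Python sketch with illustrative names; `and_computation` and `forced_set` are hypothetical functions standing in for the two scenarios Jason describes:)

```python
def and_computation(a, b):
    """The register's bit depends on both inputs: a genuine AND computation."""
    return a & b

def forced_set(a, b):
    """The inputs are ANDed, but the register is then unconditionally
    overwritten with 1, erasing any effect of the two input bits."""
    register = a & b  # intermediate result, immediately discarded
    register = 1      # the 'set 1' instruction
    return register

# The two procedures agree only when both inputs are 1; for every other
# input pair the forced set diverges from the real computation.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_computation(a, b), forced_set(a, b))
```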
> > This study suggests that the mind should not be modeled in that way:
> > "For decades, the cognitive and neural sciences have treated mental
> > processes as though they involved passing discrete packets of
> > information in a strictly feed-forward fashion from one cognitive
> > module to the next or in a string of individuated binary symbols --
> > like a digital computer," said Spivey. "More recently, however, a
> > growing number of studies, such as ours, support dynamical-systems
> > approaches to the mind. In this model, perception and cognition are
> > mathematically described as a continuous trajectory through a high-
> > dimensional mental space; the neural activation patterns flow back and
> > forth to produce nonlinear, self-organized, emergent properties --
> > like a biological organism."
> All of which is emulable by a digital computer.
Emulable to whose judgment? If the implications of studies like this
are true, the native mode of thought is explicitly not digital nor is
it computation. To say that it can be emulated assumes an inherent
pattern recognition capacity which equates continuous, nonlinear, self-
organized properties with discrete digital properties. If you don't
take that equivalence for granted, then there is no emulation. A TV
screen does not emulate a visual image unless you are an organism
which makes sense of it that way, like a person or maybe a cat. Not a
fly or a plant. A moth can make sense of it as a light source, but I
don't think it mistakes the TV screen for an alternate reality the way
we can through our fictional interpretations.
> > Their findings support my view that consciousness is biological
> > awareness, not modular computation.
> Except computation is well defined. We know how to make something that does
Sure, which is why it's so seductive to jump to the conclusion that
consciousness could be computation alone. It's sentimental, not
scientific, to make that assumption.
> Awareness is just using another word for consciousness.
It can be, but in this context I'm trying to refer to the sub-
consciousness of the cellular world only. My whole hypothesis is that
consciousness arises not just through the computations of physical
mechanisms, nor the physical material behind computation, but from the
sense which embodies the relation of the two. Consciousness,
awareness, perception, feeling, sense, and detection are all words for
the same essential thing, but they can imply different levels of
elaboration if we choose to think of them that way. Consciousness is
an awareness of awareness. Awareness is a perception of perceptions,
etc. There is a difference that I'm trying to point out.
Whether there is a scalar difference between them is hard to say. A
neuron may in fact be 'conscious' in its own frame of reference, but
to us it is less conscious than we are, so we can only say that it is
alive and has intelligent behaviors in the nervous system. When you say
"Awareness is just using another word for consciousness.", I translate
it like someone saying "Neurotransmitter activity is just another word
for central nervous system activity". It's not incorrect, but my point
is exactly that consciousness comes from lesser awareness, *not* just
through the neurological activity associated with that awareness. Our
awareness is not neurotransmitters; it uses them to feel, but they are
not feelings.
You received this message because you are subscribed to the Google Groups
"Everything List" group.