>if you think molecules are needed, that is, that the level of
>substitution includes molecular activity, this too can be emulated by
>a computer

But it can only be emulated in a virtual environment interfacing with
a computer-literate human being. A real mouse will not be able to live
on virtual cheese. Why can't consciousness be considered in exactly
the same way, as an irreducible correlate of specific meta-meta-meta-
elaborations of matter?

>All what consciousness (and matter) needs is a sufficiently rich
>collection of self-referential relations. It happens that the numbers,
>by the simple laws of addition and multiplication provides already
>just that. Adding some ontological elements can only make the mind
>body problem more complex to even just formulate.

Information is not consciousness. Energy is the experience of being
informed and informing, but it is not information. This is why a brain
must be alive and conscious (not in a coma) to be informed or to
inform, why a computer must be turned on to execute programs, why a
mechanical computing system has to be kinetically initialized, etc.
The path that energy takes determines the content of the experience to
some extent, but it is the physical nature of the materials through
which the continuous sense of interaction occurs that determines the
quality or magnitude of possible qualitative elaboration (physical,
chemo, bio, zoo-physio, neuro, cerebral) of that experience. Physical
will take you to detection, chemo to sense, bio to feeling, zoo to
emotion, neuro to cognition, and cerebral to full abstraction
(colloquial terms here, not a formal taxonomy). All are forms of
awareness. Consciousness implies awareness of awareness, which maybe
comes at the neuro or cerebral level, maybe lower. It has nothing to
do with the complexity of the path that the energy takes. Complexity
is an experience, not a discrete ontological condition.

>Adding some ontological elements can only make the mind
>body problem more complex to even just formulate.

This makes me think that you are sentimental about protecting the
simplicity of an abstract formula rather than faithfully representing
the problem. I'm not especially interested in the 'easy' problem of
consciousness. It's a worthwhile problem, to be sure; it's just not my
thing. I do think, however, that if we can accurately describe the
pattern from which the hard problem seems to arise, it may have
implications for both the easy and hard problems. At worst, my view
limits the aspirations of inorganic materials to simulate
consciousness, but I don't see that as anything more than an
identification of how the cosmos works. We don't want to create
consciousness; we can do that already by reproducing. We want an
omnipotent glove for the hand of consciousness that we already have.
That seems easier to accomplish if we are not convincing ourselves
that feelings must be numbers.

Craig


On Jul 21, 9:31 am, Bruno Marchal <[email protected]> wrote:
> On 21 Jul 2011, at 12:50, Craig Weinberg wrote:
>
> > I don't have a problem with living neurological systems extending
> > their functionality with mechanical prosthetics, it's the other way
> > around that is more of an issue. People driving cars doesn't mean cars
> > driving human minds.
>
> Sure, but we do both: robots with neurons, and animals, including  
> humans, with the brain partially replaced by artificial neurons.
> Anyway, if you think molecules are needed, that is, that the level of  
> substitution includes molecular activity, this too can be emulated by  
> a computer. The only way to negate computationalism consists in  
> pretending there is some NON Turing-emulable activity going on in the  
> brain, and relevant for consciousness. In that case, there is no  
> possible level of digital substitution.
>
> Note that all physical phenomena known today are Turing emulable,  
> even, in some sense, quantum indeterminacy (in the QM without  
> collapse) where the indeterminacy is a first person view of a  
> digitalisable self-multiplication experiment.
>
> All what consciousness (and matter) needs is a sufficiently rich  
> collection of self-referential relations. It happens that the numbers,  
> by the simple laws of addition and multiplication provides already  
> just that. Adding some ontological elements can only make the mind  
> body problem more complex to even just formulate.
>
> Bruno
>
> > On Jul 21, 5:48 am, Bruno Marchal <[email protected]> wrote:
> >> On 21 Jul 2011, at 00:58, Stathis Papaioannou wrote:
>
> >>> On Thu, Jul 21, 2011 at 4:40 AM, Craig Weinberg
> >>> <[email protected]> wrote:
> >>>> Chickens can walk around for a while without a head also. It  
> >>>> doesn't
> >>>> mean that air is a viable substitute for a head, and it doesn't  
> >>>> mean
> >>>> that the head isn't producing a different quality of awareness than
> >>>> it
> >>>> does under typical non-mortally wounded conditions.
>
> >>> I think you have failed to address the point made by several  
> >>> people so
> >>> far, which is that if the replacement neurons can interact with the
> >>> remaining biological neurons in a normal way, then it is not  
> >>> possible
> >>> for there to be a change in consciousness. The important thing is
> >>> **behaviour of the replacement neurons from the point of view of the
> >>> biological neurons**.
>
> >> And interfacing biological neurons with non biological circuits is  
> >> not
> >> sci.fi., nowadays.
>
> >>http://www.youtube.com/watch?v=1-0eZytv6Qk&feature=related
>
> >>http://www.youtube.com/watch?v=1QPiF4-iu6g&feature=fvwrel
>
> >>http://www.youtube.com/watch?v=-EvOlJp5KIY
>
> >> This is NOT a proof, nor even strong evidences for computationalism,
> >> but it is strong evidence that humans will believe in comp, and
> >> practice it, no matter what.
>
> >> Bruno
>
> >>http://iridia.ulb.ac.be/~marchal/
>
>
> http://iridia.ulb.ac.be/~marchal/
