On Jul 25, 10:53 pm, meekerdb <meeke...@verizon.net> wrote:
> > Not zombie neurons, just zombie imitation neurons. A natural neuron
> > could not be a zombie, but you could make a neuron that you think
> > should function like a natural neuron and it would not be able to be
> > well integrated into the person's consciousness.
>
> That's beside the point.  The only requirement is that it be integrated
> into the person's nervous system.  Then the person will behave just as
> before.  So your theory is that a person with artificial neurons
> integrated into their nervous system will have altered or zero
> consciousness yet behave perfectly normally.

I think that they could appear to behave normally, but given enough
time their lack of consciousness would show through. People sleepwalk,
have blackouts, suffer amnesia, etc. This would just be a more subtle
form of the same thing, depending on how artificial the substitute is.

> > If the imitation is
> > biological, genetic, and atomic, then it is a very good imitation and
> > I would expect a good chance for success, even if alternate gene
> > sequences or cell architectures were employed. If you cut out the
> > entire biochemical layer, and try to reproduce human consciousness
> > with only solid state electronics, you're going to get different
> > results which would exclude the ability to feel or understand human
> > experience in the absence of a living human.
>
> But there will be no way to know this is the case since the person with
> the substitute cell architecture will, ex hypothesi, behave exactly the
> same - including reporting that they feel the same feelings.  So your
> claim that you will get different results is untestable.

The scenario is flawed from the start. What a person chooses to report
is an aspect of their aggregate subjectivity; it does not originate in
the behavior of the neurons in their brain. The behavior of the
neurons is a consequence of the feelings. If you want the artificial
brain to report feelings, you are going to have to entrain the program
to behave that way.

In order to understand my position, you have to let go of the
fundamental ontological assumption that mechanics drive feeling.
Mechanics can drive feeling, and feeling can drive mechanics, but if
you base a substitute cell architecture on nothing but the logical
functionality you can observe from a 3p perspective, you're not going
to get something that exists as a 1p entity.
