On Jul 26, 12:38 pm, meekerdb <meeke...@verizon.net> wrote:
> > In order to understand my position you have to let go of the
> > fundamental ontological assumption that mechanics drive feeling.
> You don't seem to grasp that the assumption is *not* that mechanics
> drive feelings.  The "assumption" (one extensively confirmed in
> laboratories) is that mechanics drive actions.  

The terms 'mechanics' and 'actions' are interchangeable. To say that
one drives the other is tautological.

>That's why it is
> possible to imagine a philosophical zombie in which all *actions* are
> the same because all the mechanics are functionally equivalent.  

Yes, an entity which performs actions not driven by feeling is a
philosophical zombie.

> You
> have repeatedly started with this hypothesis and then ended by
> contradicting the very hypothesis you started with.  You apparently
> assume that "feeling" (or "qualia") come from a ghost in the machine.

No, I'm saying that the ghost and the machine are opposite sides of
the same ghost-machine continuum (really more like an essence-essence
continuum).

>   But then, contrary to the assumption that action is driven by
> mechanics, you assert that behavior will be different even though the
> mechanics are the same.  This is simply incoherent.

I understand that it seems that way, because I'm avoiding the
misplaced confidence in behavior being determined by mechanics.
Sometimes it is, but sometimes behavior is determined by feeling and
cannot arise in the same way without it. I'm saying that there is no
such thing as the mechanics being the same as something without
feeling, not because feeling and mechanics are dependent upon each
other, but because they are in fact essentially the same thing,
just manifested in two diverging existential topologies.

Think about how different a task is when you are strongly motivated
personally to do it. This conversation, for instance. We aren't
getting paid for our time here. If this were about tax laws or
something, you couldn't pay me enough to put this kind of time into
it. My simulation can't understand that. Even if the neurons appeared
to behave the same when you created it, the similarity begins to
degenerate immediately as the lack of semantic significance begins to
overwhelm the limits of what can be decided logically. A person's
external behavior arises from constant expression of the thoughts and
feelings of the self. Duplicating the mechanisms which control the
movements of the face isn't an effective substitute over time, just
as a person on YouTube is not a person.

> Fine.  Then you believe there can be philosophical zombies.  Beings that
> are exactly like humans from every 3p perspective, but are different as
> 1p experiencers.   So stop saying, "But their behavior will be different".

One problem is that 'behavior' is too mechanical a word to describe
what people are to each other. Sometimes it matters whether a person's
behavior is intentional or not. That's an interpretive function of
consciousness. We have to decide to what extent others' actions are
intentional. If behavior were truly all that went into our 3p
presentation, we wouldn't care about others' motives; we would just
deal with their behavior. So, yes, you can have limited functional
equivalence without sentience, but you can't necessarily have
indefinite, ongoing equivalence.

You received this message because you are subscribed to the Google Groups 
"Everything List" group.