On Jan 15, 3:07 pm, Quentin Anciaux <allco...@gmail.com> wrote:
> 2012/1/14 Craig Weinberg <whatsons...@gmail.com>
>
> > Thought I'd throw this out there. If computationalism argues that
> > zombies can't exist, therefore anything that we cannot distinguish
> > from a conscious person must be conscious, that also means that it is
> > impossible to create something that acts like a person which is not a
> > person. Zombies are not Turing emulable.
>
> No, zombies *that are persons in every aspect* are impossible. Not only are
> they not Turing emulable... they are absurd.

If you define them that way, then the word has no meaning. What is a
person in every aspect that is not at all a person? The only way the
term has meaning is if it refers to something that appears to be a
person in every way to an outside observer (which would ultimately
have to be a human observer) but has no interior experience. That is
not absurd at all; in fact it describes animation, puppetry, and
machine intelligence.

>
> > If we run the zombie argument backwards then, at what substitution
> > level of zombiehood does a (completely possible) simulated person
> > become an (non-Turing emulable) unconscious puppet? How bad of a
> > simulation does it have to be before becoming an impossible zombie?
>
> > This to me reveals an absurdity of arithmetic realism. Pinocchio the
> > boy is possible to simulate mechanically, but Pinocchio the puppet is
> > impossible.
>
> You conflate two (maybe more) notions of zombie... the only one that matters
> in the "zombie argument" is this: something that acts like a person ***in
> every aspect*** but nonetheless is not conscious... If that is indeed what
> you mean, then could you devise a test that could show that the zombie
> indeed lacks consciousness? (Remember that *by definition* you cannot tell
> the zombie apart from a "real" conscious person.)

No, I think that I have a workable and useful notion of zombie. I'm
not sure how the definition you are trying to use is meaningful. It
seems like a straw man of the zombie issue. We already know that
subjectivity is private; what we don't know is whether that means
simulations automatically acquire consciousness. The zombie issue is
not about showing that we can't imagine a person without subjectivity
and taking that as evidence that subjectivity must inherently arise
from function. My point is that it must also mean that we cannot stop
inanimate objects from acquiring consciousness if they are a
sufficiently sophisticated simulation.

Craig
