On Feb 2, 4:05 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Feb 2, 2012 Craig Weinberg <whatsons...@gmail.com> wrote:
> > My view is that the whole idea that there can be a 'functional equivalent
> > of emotions' is completely unsupported. I give examples of puppets
> A puppet needs a puppeteer, a computer does not.

Yes it does. It needs a user at some point to make any sense at all of
what the computer is doing. An abacus is a computer. Left to its own
devices, it's just a rectangle of wood and bamboo or whatever. Attach a
motor to it that does computations with it, and you still have a
rectangle of wood with a metal motor attached to it, clicking out
meaningless non-patterns in the abacus with nothing to recognize the
patterns but the occasional ant getting injured by a rapidly sliding
bead.

> > movies, trashcans that say THANK YOU, voicemail...all of these things
> > demonstrate that there need not be any connection at all between function
> > and interior experience.
> For all these examples to be effective you would need to know that they do
> not have an inner life, and how do you know they don't have an inner life?
> You know because they don't behave as if they have an inner life. Behavior
> is the only tool we have for detecting such things in others, so your
> examples are useless.

No, because people in a vegetative state do sometimes have an inner
life despite their behavior. It is our similarity to and familiarity
with other humans that encourages us to give them the benefit of the
doubt. We go the extra mile to see if we can figure out if they are
still alive. Most of us don't care as much whether a steer is alive
when we slaughter it for hamburger patties, or whether a carrot feels
something when we rip it out of the ground. With that in mind, we
certainly don't owe a trashcan lid any such benefit of the doubt. Like
a computer, it is manufactured out of materials selected specifically
for their stable, uniform, inanimate properties. I understand what you
mean though, and yes, our perception of something's behavior is a
primary tool in how we think of it, but not the only one. More
important is the influence of conventional wisdom in a given society
or group. We like to eat beef, so most of us rationalize it without
much thought despite the sentient behavior of steers. We like the idea
of AI, so we project the possibility of feeling and understanding onto
it; we go out of our way to prove that it is possible despite the
automatic and mechanical behavior of the instruments.

You received this message because you are subscribed to the Google Groups 
"Everything List" group.