On Feb 7, 1:41 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Tue, Feb 7, 2012  Craig Weinberg <whatsons...@gmail.com> wrote:
>
> > If you are proving that a computer in the position of the man has no
> > understanding then this thought experiment proves it.
>
> How in hell would putting a computer in the position of the man prove
> anything??

Because that is exactly the position a computer is in when it runs a
program: it takes user input and mechanically produces output. The man
in the room is the CPU.

> The man is just a very very very small part of the Chinese Room,

He and the rule book are the only parts that are relevant to strong
AI.
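The CPU-and-rule-book analogy can be made concrete with a minimal,
purely illustrative Python sketch (the table entries and replies here
are invented for the example; Searle's actual scenario involves a vastly
larger rule book). The point it illustrates is that fluent-looking
output can come from symbol matching alone:

```python
# The "rule book": a hypothetical table mapping input symbols to
# scripted output symbols. The entries are invented for illustration.
RULE_BOOK = {
    "你好": "你好，你好吗？",
    "你好吗？": "我很好，谢谢。",
}

def chinese_room(symbols: str) -> str:
    """Play the role of the man: look up the symbols and return the
    scripted reply, with no grasp of what either string means."""
    # Fallback reply ("please say that again") for unknown input.
    return RULE_BOOK.get(symbols, "请再说一遍。")
```

Whether such a lookup process could ever constitute understanding is,
of course, precisely what the two sides of this thread dispute.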

> all Searle proved is that a tiny part of a system does not have all the
> properties the entire system has. Well duh, the neurotransmitter
> acetylcholine is part of the human brain and would not work without it, but
> acetylcholine does not have all the properties that the entire human brain
> has either.

You are assuming a part/whole relationship rather than a form/content
relationship.

>
> > It takes consciousness for granted, like some free floating glow.
>
> Oh now I see the light, literally, consciousness is like some free floating
> glow! Now I understand everything!

Yes, that would solve your problem completely. The rule book and the
man glow a little, and together they make the whole room glow much
more.

>
> > If I understand how to cook and then I walk into a building, does the
> > building, now that it includes me, now know how to cook?
>
> If you didn't know how to cook, if you didn't even know how to boil water
> but the building was now employed at 4 star restaurants preparing delicious
> meals then certainly the building knows how to cook, and you must be a very
> small cog in that operation.

I guess you are talking about a universe where buildings work at
restaurants or something.

>
> > Searle is assuming the common sense of the audience to show them that
> > having a conversation in a language you don't understand cannot constitute
> > understanding
>
> I am having a conversation right now and acetylcholine is in your brain but
> acetylcholine does not understand English, so I am having a conversation in
> English with somebody who does not understand English. Foolish reasoning is
> it not.

Again, you assume a part/whole relation rather than a form/content
relation. A story is not made of the pages in a book; it is an
experience made possible through understanding the symbolic content of
those pages. Anyone can copy the words from one book to another, or
follow instructions about which sections of which book to excise and
reproduce, but that does not make them a storyteller.

>
> > The organization of my kitchen sink does not change with the temperature
> > of the water coming out of the faucet.
>
> Glad to hear it, now I know who to ask when I need plumbing advice.

Couldn't think of a legitimate counterpoint?

Craig

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.