On Thu, Feb 9, 2012 Craig Weinberg <whatsons...@gmail.com> wrote:
> The rule book is the memory.

Yes, but the rule book not only contains an astronomically large database, it
also contains a super ingenious artificial intelligence program; without
those things the little man is like a naked microprocessor sitting on a
storage shelf. It's not a brain, it's not a computer, and it's not doing one
damn thing.


> >The contents of memory is dumb too - as dumb as player piano rolls.


That's pretty dumb, but the synapses of the brain are just as dumb, and the
atoms they (and computers and everything else) are made of are even
dumber.

> The two together only seem intelligent to Chinese speakers outside the
> door


Only?! Einstein only seemed intelligent to scientifically literate speakers
in the outside world. It "seems" that, as you use the term, seeming
intelligent is as good as being intelligent. In fact, it seems to me that
believing intelligent actions are not a sign of intelligence is not very
intelligent.

> A conversation that lasts a few hours could probably be generated from a
> standard Chinese phrase book, especially if equipped with some useful
> evasive answers (a la ELIZA).


You bring up that stupid 40-year-old program again? Yes, ELIZA displayed
little if any intelligence, but that program is 40 years old! Do try to keep
up. And if you are really confident in your ideas, push the thought
experiment to the limit and let the Chinese Room produce brilliant answers
to complex questions; if it just churns out ELIZA-style evasive crap, that
proves nothing, because we both agree that's not very intelligent.

> The size isn't the point though.


I rather think it is. A book larger than the observable universe and a
program more brilliant than any ever written, yet you insist that if
understanding is anywhere in that room it must be in the by far least
remarkable part of it, the silly little man. And remember, the consciousness
that room produces would not be like the consciousness you or I have; it
would take that room many billions of years to generate as much
consciousness as you do in one second.

> Speed is a red herring too.
>

No it is not, and I will tell you exactly why as soon as the sun burns out
and collapses into a white dwarf. Speed isn't an issue, so you have to
concede that I won that point.

> if it makes sense for a room to be conscious, then it makes sense that
> anything and everything can be conscious


Yes, provided the thing in question behaves intelligently. We only think
our fellow humans are conscious when they behave intelligently, and that's
the only reason we DON'T think they're conscious when they're sleeping or
dead; all I ask is that you play by the same rules when dealing with
computers or Chinese Rooms.

>> However Searle does not expect us to think it odd that 3 pounds of grey
>> goo in a bone vat can be conscious
>>
>
> Because unlike you, he [Searle] is not presuming the neuron doctrine. I
> think his position is that consciousness cannot be solely a result of the
> material functioning of the brain and it must be something else.


And yet if you change the way the brain functions, through drugs or surgery
or electrical stimulation or a bullet to the head, the conscious experience
changes too. And if the brain can make use of this free-floating glowing
bullshit of yours, what reason is there to believe that computers can't also
do so? I've asked this question before, and the best you could come up with
is that computers aren't squishy and don't smell bad, so they can't be
conscious. I don't find that argument compelling.

> We know the brain relates directly to consciousness, but we don't know
> for sure how.


If you don't know how the brain produces consciousness, then how in the
world can you be so certain a computer can't do it too, especially if the
computer is as intelligent as or even more intelligent than the brain?

> We can make a distinction between the temporary disposition of the brain
> and its more permanent structure or organization.
>

A .44 Magnum bullet in the brain would cause a change in brain organization,
and one that would seem to be rather permanent. I believe such a thing would
also cause a rather significant change in consciousness. Do you disagree?

  John K Clark

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
