On 9/13/2011 4:23 PM, Craig Weinberg wrote:
On Sep 13, 3:23 pm, meekerdb <meeke...@verizon.net> wrote:
On 9/13/2011 12:00 PM, Craig Weinberg wrote:

It's easy to assume that it helps, just as it's easy for me to assume
that we have free will. If we don't need our conscious mind to make
decisions, then we certainly don't need the fantasyland associated
with our conscious minds to help with that process. Think of building
a robot that walks around and looks for food and avoids danger. Why
would it help to construct some kind of Cartesian theater inside of
it? Functionally, there is no reasonable explanation for perception or
experience, especially if you believe in determinism.
It would help, even be essential, to the robot's learning for it to remember things. But not just everything. It needs to remember important things, like what it was doing just before it fell down the stairs. So you design it to continually construct a narrative history, and if something important happens you tuck that piece of narrative history into a database for future reference by associative memory ('near stairs'? don't back up). This memory consists of connected words learned by the speech/hearing module, plus images. For efficiency you use these same modules for associative construction of the narrative memory and for recall. Hence part of the same processing is used for recall and cogitation as well as for perception and learning. That's why thinking has a similarity to perception, i.e. sitting in a Cartesian theater.
Oh, I agree that there is a functional advantage to perception; it's just not sufficient to explain the existence of it. Our immune system needs to learn to remember things too. It may very well have a narrative history of pathogens and strategic options, but there is no compelling reason to assume that there is a theatrical presentation going on which is comparable to our Cartesian theater.

The operative word is "comparable". Of course it's not going to be "comparable" except in very broad functional terms, as you have acknowledged above.

Not only would such a thing be unnecessary, and probably detrimental in terms of computational overhead, but there really is no plausible raw material for this perception to be manifested through.

It's not necessary for a robot, but evolution has to work with what's available. And it may be more efficient than having separate processing for cogitation and perception.
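
To make the narrative-memory idea above concrete, here is a minimal sketch in Python. The class and its names (NarrativeMemory, perceive, recall, the salience flag, the stairs example) are illustrative assumptions, not an actual robot design; the point it shows is that the same description store is used both when logging perception and when recalling, which is the shared-module efficiency I mean.

from collections import deque

class NarrativeMemory:
    """Rolling narrative history plus an associative long-term store."""
    def __init__(self, window=20):
        self.narrative = deque(maxlen=window)   # recent narrative history
        self.longterm = {}                      # tag -> list of stored snippets

    def perceive(self, description, tags=(), important=False):
        # The same module that logs ordinary perception...
        self.narrative.append(description)
        if important:
            # ...tucks the recent narrative away for future reference.
            snippet = list(self.narrative)
            for tag in tags:
                self.longterm.setdefault(tag, []).append(snippet)

    def recall(self, tag):
        # Associative recall replays the stored descriptions, so remembering
        # reuses the same representations as perceiving.
        return self.longterm.get(tag, [])

# Example: the robot keeps what it was doing just before the fall.
robot = NarrativeMemory()
robot.perceive("near stairs", tags=("stairs",))
robot.perceive("backing up", tags=("stairs",))
robot.perceive("fell down the stairs", tags=("stairs", "fall"), important=True)
print(robot.recall("stairs"))
# -> [['near stairs', 'backing up', 'fell down the stairs']]

Nothing here requires a separate recall pathway: recall just re-reads the same descriptions the perception step wrote, which is why a shared module can be cheaper than duplicating the machinery.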

It would be much easier
to just make the robot construct omniscient telepathy

That's easy?  Then please explain how to do it.

than to somehow
conjure an unprecedented thing like feeling or color out of
mathematical function.

You don't have to 'conjure' them.  They are inherent in perception.

Brent
