Stephen Paul King wrote:

it seems to me that if
minds are purely classical then it would not be difficult for us to imagine,
i.e. compute, what it is like to "be a bat" or any other classical mind. I
see this as implied by the ideas involved in Turing Machines and other
"Universal" classical computational systems.

Ah, but human thinking is a resource-bounded, real-time computational activity.
Despite the massive parallelism of brain computation, we are of necessity
"lazy evaluators" of thoughts. If we weren't, we'd all go mad or become
successful zen practitioners. Sure, we do some free-form associative thought,
and ponder connections subconsciously in the background, but if there's one thing
my AI and philosophy studies have taught me, it is that prioritization
and pruning of reasoning are fundamental keys. There are an infinite
number of implications and probability updates that could be explored, given our
present knowledge. But clearly we're only going to perform task-directed,
motivationally driven, sense-data-related subsets of those inferences, plus a
finite amount of related associative inference in the background to support them.
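
To make the prioritization-and-pruning point concrete, here is a minimal sketch
(in Python, my choice of language; the "thoughts", scores, and budget are toy
assumptions, not a claim about how brains actually do it) of resource-bounded,
priority-driven inference:

```python
import heapq

def bounded_inference(seeds, expand, score, budget):
    """Explore only the highest-priority inferences within a fixed budget.

    seeds:  initial thoughts (plain strings in this toy model)
    expand: function mapping a thought to its candidate implications
    score:  relevance score; higher = more task-relevant (assumed given)
    budget: maximum number of inferences we are willing to perform
    """
    # Max-heap via negated scores: most relevant thoughts are explored first.
    frontier = [(-score(s), s) for s in seeds]
    heapq.heapify(frontier)
    explored = []
    while frontier and len(explored) < budget:
        _, thought = heapq.heappop(frontier)
        explored.append(thought)
        for nxt in expand(thought):
            # Pruning: implications below a relevance threshold are dropped.
            if score(nxt) > 0.2:
                heapq.heappush(frontier, (-score(nxt), nxt))
    return explored  # everything outside the budget is simply never computed
```

The point is the last line: the pruned implications are not merely deferred,
they are never computed at all, which is what "lazy evaluation" of thought
amounts to here.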
Therefore, if nothing else, we can't imagine what it is like to be a bat,
because we would need the reasoning time and resources to explore all of a bat's
experience to get there. It would be difficult, and probably impossible, for other
reasons too: a bat's mind at birth is preloaded with different instinctive
"firmware" behaviours than ours is. A bat's mind is also connected to a different
though analogous set of nerves, sense organs, and motor control systems, and to a
differently balanced neurochemical emotional (reasoning-prioritization) system.
Regarding emulating another person's experience: the trouble, again, is that you'd
have to emulate all of it from (before) birth, because our minds are clearly built
up of our individual experiences, our responses to our environment, and our own
particularly skewed generalizations from those, as much as of anything else.
And again, you'd have to compensate for (emulate) the subtle but vast differences
in the firmware of each person's brain as it came out of the womb. It's an
impossible project in practical terms, even if brains are Turing equivalent,
which they are.

You don't need to resort to QM to explain the difficulty of emulating other minds.
It's simply a question of combinatorics, and of the vast complexity and subtlety
of firmware, experience, and knowledge.
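
A back-of-the-envelope illustration of the combinatorics (every number below is
invented purely for the sake of the example, not an estimate of real brains):

```python
# Back-of-the-envelope only; both parameters are invented for illustration.
branching = 10          # candidate implications per thought (assumed)
depth = 20              # length of one inference chain (assumed)
chains = branching ** depth   # distinct chains of that depth: 10**20

# Even a wildly generous emulator doing a billion inferences per second
# would need on the order of 10**11 seconds, i.e. thousands of years,
# just to enumerate the chains once.
rate = 10 ** 9
seconds = chains // rate
years = seconds / (3600 * 24 * 365)
print(f"{chains:e} chains; ~{years:,.0f} years at {rate:e} inferences/s")
```

The branching factor and depth are small by any realistic standard, which is
the point: the blow-up is combinatorial, and no appeal to QM is needed.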
Remember, on the other hand, that human linguistic communication conveys only
"tips of icebergs" of meaning explicitly in the words, and assumes that the utterer
and the reader/listener share a vast knowledge, belief and experience base, and
have similar tendencies toward conjuring up thinking contexts in response to the
prodding of words. (Words are to mentally stored concepts as URLs are to documents).
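
The words-as-URLs analogy can be made literal in a toy model: a word is a short
key that dereferences a large structure stored on the listener's side, and the
structure itself is never transmitted with the word. (Python again; the concept
entries are invented placeholders, not a theory of mental representation.)

```python
# Toy model of "words are URLs": the listener's stored concept is vastly
# bigger than the word that points at it.
concepts = {
    "bat": {
        "category": "mammal",
        "senses": ["echolocation", "vision"],
        "associations": ["nocturnal", "caves", "Nagel's essay"],
    },
}

def interpret(word, memory):
    """Dereference a word against the listener's own knowledge base."""
    # A word the listener has never stored dereferences to nothing --
    # communication presupposes a shared iceberg below the waterline.
    return memory.get(word)
```

Only the key "bat" crosses between speaker and listener; everything the word
actually means has to already be present in `memory` on the receiving end.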

In order to communicate, we do have to emulate (imagine) our target audience's
thought patterns, current thinking context, and emotional state, so that we know
which sequence of words is likely to direct their thoughts and feelings thus and
so, as we wish to direct them.

