On 2/21/2012 8:32 AM, Terren Suydam wrote:
Bruno and others,

Here's a thought experiment that for me casts doubt on the notion that
consciousness requires 1p indeterminacy.

Imagine that we have scanned my friend Mary so that we have a complete
functional description of her brain (down to some substitution level
that we are betting on). We run the scan in a simulation of classical
physics. The simulation is completely closed, which is to say,
deterministic: we can run it a million times for a million years each
time, and the final states of all the runs will be identical. Now,
within the context of the simulation, we can ask her, "Are you
conscious, Mary? Are you aware of your thoughts?" She replies yes.
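For concreteness, here is a toy sketch of what "closed, which is to
say, deterministic" buys you. The step() function is a hypothetical
stand-in for one tick of the simulated physics, nothing more:

    # Toy sketch: a closed, deterministic simulation. step() is a
    # hypothetical stand-in for one tick of the simulated physics.
    def step(state: int) -> int:
        # Fixed update rule (a 64-bit linear congruential map).
        return (state * 6364136223846793005
                + 1442695040888963407) % 2**64

    def run(initial_state: int, ticks: int) -> int:
        state = initial_state
        for _ in range(ticks):
            state = step(state)
        return state

    # Many runs (a million, scaled down here) from the same initial
    # state all land on the identical final state.
    results = {run(initial_state=42, ticks=10_000) for _ in range(100)}
    assert len(results) == 1  # deterministic: exactly one outcome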

Next, we tweak the simulation in the following way. We plug a source
of quantum randomness (random numbers from a quantum random number
generator) into a simulated water fountain. Now the simulation is no
longer deterministic: a million runs will result in a million
different computational states after a million years. We ask Mary the
same questions, and again she replies yes.
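The same toy sketch, with an outside entropy source spliced in.
Here secrets.randbits() merely stands in for the quantum random
number generator feeding the fountain:

    # The same toy simulation, perturbed each tick by fresh external
    # randomness; secrets.randbits() stands in for the quantum RNG
    # feeding the simulated water fountain.
    import secrets

    def step_with_fountain(state: int) -> int:
        deterministic = (state * 6364136223846793005
                         + 1442695040888963407) % 2**64
        return deterministic ^ secrets.randbits(64)

    def run_open(initial_state: int, ticks: int) -> int:
        state = initial_state
        for _ in range(ticks):
            state = step_with_fountain(state)
        return state

    # Identical initial conditions, yet the runs now diverge
    # (collisions among 64-bit outcomes are astronomically unlikely).
    results = {run_open(initial_state=42, ticks=10_000)
               for _ in range(100)}
    assert len(results) > 1  # nondeterministic: outcomes differ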

In the deterministic scenario, Mary's computational state is traced an
infinite number of times in the UD*, but only because of the infinite
number of ways a particular computational state can be instantiated in
the UD* (different levels, the UD implementing other UDs recursively,
iteration along the reals, etc.). It's a stretch, however, to say that
there is 1p indeterminacy, because her computational state as
implemented in the simulation is deterministic.
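For readers unfamiliar with the UD: it dovetails, i.e. interleaves,
the execution of every program so that each one eventually gets
arbitrarily many steps. A minimal sketch, with make_program() as a
hypothetical stand-in for the real enumeration of all programs:

    # Minimal dovetailing sketch. make_program() is a hypothetical
    # stand-in; the real UD enumerates all programs.
    def make_program(i: int):
        state = 0
        def step():
            nonlocal state
            state += i + 1  # toy behavior: program i counts by i+1
            return state
        return step

    def dovetail(phases: int):
        programs = []
        for phase in range(phases):
            programs.append(make_program(phase))  # admit one new program,
            for prog in programs:                 # then give every program
                prog()                            # admitted so far one step

    # As phases grows without bound, every program receives
    # unboundedly many execution steps.
    dovetail(1000)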

In the second scenario, her computational state is traced in the UD*,
and it is clear that there is 1p indeterminacy, as the splitting
entailed by the quantum random number generator "brings Mary along",
so to speak.

So if Mary is not conscious in the deterministic scenario, she is a
zombie: she answers "yes" just as before, yet there is nothing it is
like to be her. The only way to be consistent with this conclusion is
to insist that the substitution level must be at the quantum level.

If OTOH she is conscious, then consciousness does not require 1p indeterminacy.

But is it really either-or? Isn't it likely that there are different
kinds and degrees of consciousness? I'm not clear on what Bruno's
theory says about this. On the one hand he says all Lobian machines
are (equally?) conscious, but on the other he says it depends on the
program they are executing.

Brent
