On 29 Mar 2015, at 08:54, Bruce Kellett wrote:

meekerdb wrote:
On 3/28/2015 11:02 PM, Bruce Kellett wrote:
meekerdb wrote:

The calculation written out on paper is a static thing, but the result of that calculation might still be part of a simulation that produces consciousness. Though, unless Barbour is right and the actuality of time can be statically encoded in his 'time capsules' (current memories of past instances), I was thinking in terms of a sequence of these states (however calculated).
Yes, I agree that the computation should not have to halt (compute a function) in order to instantiate consciousness; it can just be a sequence of states. Written out on paper it can be a sequence of states ordered by position on the paper. But that seems absurd, unless you think of it as consciousness in the context of a world that is also written out on the paper, such that the writing that is conscious is *conscious of* this written-out world.
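
To make the "written out on paper" reading concrete, here is a minimal sketch in Python (the update rule is invented purely for illustration; nothing here comes from the thread). The computation never halts, yet any finite prefix of its states can be written out in order, like lines on a page:

    from itertools import islice

    def states(initial):
        # A computation that never halts: it just yields state after state.
        state = initial
        while True:
            yield state
            state = (3 * state + 1) % 17   # arbitrary toy update rule

    # The first ten states, 'written out' as a static, ordered list.
    paper = list(islice(states(1), 10))
    print(paper)   # [1, 4, 13, 6, 2, 7, 5, 16, 15, 12]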

My present conscious state includes visual, auditory and tactile inputs -- these are part of the simulation. But they need only simulate the effect on my brain states during that moment -- they do not have to simulate the entire world that gave rise to these inputs. The recreated conscious state is not counterfactually accurate in this respect, but so what? I am reproducing a few conscious moments, not a fully functional person.

But in the MGA (or Olympia) we are asked to consider a device which is a conscious AI, and then we are led to suppose that a radically broken version of it works even though it is reduced to playing back a record of its processes. I think the playback of the record fails to produce consciousness because it is not counterfactually correct and hence is not actually realizing the states of the AI: those states essentially include the fact that some branches were not taken. Maudlin's invention of Klara is intended to overcome this objection and provide a counterfactually correct but physically inert sequence of states. But I think Maudlin underestimates the problem of context, and the additions necessary for counterfactual correctness will extend far beyond "the brain" and entail a "world". These additions come for free when we say "Yes" to the doctor replacing part of our brain, because the rest of the world that gave us context is still there. The doctor doesn't remove it.
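
To see concretely why the playback is not counterfactually correct, here is a minimal sketch in Python (the names and the branching rule are invented for illustration, not taken from Maudlin). The replay agrees with the live machine on the one input history that actually occurred, but it has no correct behaviour for a branch that was not taken:

    def live_step(state, stimulus):
        # The real machine branches on its input.
        return state + stimulus if stimulus > 0 else state - 1

    def run_live(initial, stimuli):
        # The full sequence of states for a given input history.
        trace, state = [], initial
        for s in stimuli:
            state = live_step(state, s)
            trace.append(state)
        return trace

    actual_inputs = [3, 1, 4]
    record = run_live(0, actual_inputs)      # the 'film' of the actual run

    def run_replay(initial, stimuli):
        # The replay ignores its inputs and just plays back the record.
        return list(record)

    # Agrees on the history that actually happened...
    assert run_replay(0, actual_inputs) == run_live(0, actual_inputs)
    # ...but is wrong on an untaken branch.
    assert run_replay(0, [9, 9, 9]) != run_live(0, [9, 9, 9])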

In the "yes doctor" scenario as reported by Russell, it talks only about replacing your brain with an AI program on a computer. It does not mention connecting this to sense organs capable of reproducing all the inputs one normally gets from the world. If this is not clearly specified, I would certainly say 'No' to the doctor. There is little point or future in being a functioning brain without external inputs. As I recall sensory deprivation experiments, subjects rapidly subside into a meaningless cycle of states -- or go mad -- in the absence of sensory stimulation.

?

The "yes doctor" scenario assumes your brain is replaced in your skull and correctly interfaced with the other organs. I guess that can be implicit in some presentations. That is step 0, or the definition of comp. Then, at step 6, you are plunged completely into a computer, but it is supposed to simulate Moscow and Washington very well, for some finite time. There is no sensory deprivation in any of the situations involved in the reasoning.

Bruno

http://iridia.ulb.ac.be/~marchal/


