On 4/1/2012 14:33, David Nyman wrote:
Bruno, when you talk about the doctor offering one a replacement brain
you usually describe the substitute as digital, although I think you
have sometimes just said that it is artificial.  My recent remarks
about "game physics" got me thinking about this distinction, if indeed
there is one.

Suppose Dick's friend Harry, having been previously diagnosed with an
incurable brain cancer, has had an artificial brain installed. The
doctor tells Dick that he has replaced Harry's brain with a (very
sophisticated!) battery-driven clockwork substitute.  Harry tells Dick
that the replacement has been entirely successful: "After the
operation I felt a little woozy at first, but I feel great now.  My
memory is excellent - if anything better than before - and my
appreciation of the finer things in life is as vivid as ever."  Dick
is a bit sceptical at first (his faith in clockwork has been
prejudiced by a rather unreliable fake Rolex he bought in Hong Kong)
but over a period of several months of careful observation he finds he
can't distinguish any difference whatsoever between Harry's new
clockwork personality and his former self.  Their friendship is as strong as ever.

This turns out to be just as well, because - horror of horrors - Dick
is shortly afterwards also diagnosed with a terminal brain condition.
Should he now be willing to submit to the same procedure as Harry?  He
is still a little sceptical of clockwork, but the evidence of Harry's
successful transformation is very difficult to discount, and the
doctor shows him several other "before and after" videos with equally
convincing outcomes. The artificial brains may be clockwork, but the
doctor assures him it is clockwork of unprecedented sophistication
and precision, unheard of even in the hallowed halls of Swiss
horology. Dick has stumbled across the Everything List, and is rather
persuaded by the computational theory of mind.  Trouble is, the doctor
is not of this persuasion.  He tells Dick that the goal of the
operation is only to substitute a clockwork analogue for the
electro-chemical mechanisms of his organic brain, and that on this
basis Dick can confidently expect that the same inputs will reliably
elicit the same responses as before.  Hearing this, Dick is now
worried that, however successful the replacement of Harry's brain has
been behaviourally, his friend is now essentially a mindless clockwork zombie.

Since he certainly doesn't want to suffer such an indignity, should he
say no to the doctor?  The question that troubles Dick is whether,
assuming comp, he should accept a genuinely
behaviourally-indistinguishable body, irrespective of its brain being
organic or clockwork, as an equivalent "avatar" according to the rules
of the comp game-physics.  If so, Dick should have no reason not to
accept a behaviourally-indistinguishable, clockwork-equipped body as
enabling his continued manifestation relative to the familiar
environments to which he has become so emotionally attached.  Time is
short, and he must act.  What should he do?


It seems to me the question is whether someone should bet on COMP.

If Dick had trouble assigning consciousness to Harry because Dick was a solipsist, then he might have a hard time betting on COMP. Of course, your post does not suggest that Dick holds such a view; solipsism is just one of many unfalsifiable viewpoints (since one cannot know of any consciousness other than one's own), but not one we consider likely (by induction on observed behavior and its similarity to our own internal states).

Suppose instead that Dick thinks mechanism (COMP) is true: that is, the subjective experience he has corresponds to the inside view of some abstract structure or process implemented in his brain. In other words, his brain has no magical properties that make it conscious, and conscious experience merely appears (by induction) to place us relative to a physical brain.

By induction we can also observe that changing our brain through medicine, drugs, or other means changes our conscious experience; but it shouldn't, if whatever we change does not change our functionality (for a thought experiment about what happens to consciousness when only small parts change, see http://consc.net/papers/qualia.html ). Rejecting this leads to all kinds of strange partial philosophical zombies, which many people find incoherent. Dick would have to decide for himself whether they make sense to him, and perhaps even experiment on himself; after all, the COMP doctor is available.

Dick should also consider the UDA and the proof that mechanism is incompatible with materialism (since Dick assumes the existence of mind and consciousness by default, I am not considering the option of denying them here).

If Dick thinks COMP is worth betting on, only one worry remains: did his doctor choose the right substitution level? If the chosen level is higher than the correct one, he might have slightly different experiences than before, despite no longer being able to tell. More worryingly (or possibly the opposite, as it opens up some very interesting possibilities), such an incorrect bet on the level might change his measure, at worst making his experience less stable (jumpy). Identical behavior over some limited span of time does not guarantee a correctly guessed substitution level (can you show that two functions are identical?). What if the level is too low, and depends too much on entangled states (the generalized-brain idea) to stay stable? One could object that the same may be true of his new digital physical brain, but what if he constantly changes substrates, as a proper substrate-independent mind would like to (VRs and other environments)? Then again, maybe he shouldn't worry that much anyway: we can't really know that our present experience is continuous either, given the discontinuities of sleep (as well as more exotic cases).
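The parenthetical question above ("can you show that two functions are identical?") can be made concrete: extensional equivalence of programs is undecidable in general (a consequence of Rice's theorem), so any finite battery of tests, like Dick's months of observation, can in principle be fooled. A minimal sketch, with hypothetical stand-in functions of my own invention (nothing here is from the original post):

```python
def original_brain(stimulus: int) -> int:
    """Stand-in for the organic brain's input/output behaviour."""
    return stimulus * 2

def clockwork_brain(stimulus: int) -> int:
    """Stand-in for the substitute: agrees on every input the observation
    period happens to test, but diverges on one it never produces."""
    if stimulus == 10**9:   # never reached during months of observation
        return 0            # hidden divergence
    return stimulus * 2

# Months of careful observation: every tested input agrees...
assert all(original_brain(s) == clockwork_brain(s) for s in range(10_000))

# ...yet the two functions are not identical.
assert original_brain(10**9) != clockwork_brain(10**9)
```

The point is only that passing every observed test is weaker than being the same function, which is why a behaviourally successful substitution does not by itself prove the level was guessed correctly.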

It comes down to what Dick finds more plausible, or which heuristics he is willing to use in selecting which theory is more likely to be true. In a way it is a matter of 'theology' or 'religion', as this belief will have to be taken on faith, however plausible it is given our observations. Also, what does he have to lose? If he believes he would die without the operation, he need only evaluate which choice makes it most likely that he gets what he wants: does he want to live, or does he want to defer to the prejudices of vitalists who believe in some other theory of mind, even if it could cost him his life?
