On 01 Apr 2012, at 18:12, David Nyman wrote:

On 1 April 2012 16:48, Bruno Marchal <marc...@ulb.ac.be> wrote:

David, if Dick does not have the impression that Harry has become a sort of zombie of some kind, for a time, then I would suggest he trust Harry and his doctor, if he is prepared to bet on comp. Once he bets on comp, the nature of the ultimate constituents of whatever does the computation, relative to its usual environments, does not matter.

Yes, once one has bet on comp, the distinction between "software" and
"hardware" is one of relative level rather than fundamental ontology.
You appear to confirm my thought that the best evidence that the
replacement brain implements the right computation is its behaviour,
and hence that of the recipient.  So Dick can only rely on his
assessment of Harry's behaviour to give him confidence for his own bet
on this particular doctor's expertise.

Yes, I am afraid that this will be all we have to rely on. And the situation might be difficult with the first artificial brains, with people saying, after some months, that they have survived but that something is different, without being able to say precisely what it is, like explaining the effect of a slight alcohol buzz to someone who has never drunk.

If you assume comp, you should not be afraid of being mechanical at some level, because that is what is stipulated at the start. But you might fear that the doctor has come only close enough to the right level to produce behavior close to normal, but with slight differences, which might or might not matter.

Would you say yes to Harry's doctor, to get the same model of artificial brain, in case Harry behaves very differently but still says that he is glad to have done the transplant? What if Harry says to Dick, "Look, it is not as good as my organic brain, but I still enjoy a lot of things, and it seems to me better than being dead, so I would suggest you go for it ..."?

Hard question. But not unrelated to deciding whether or not to commit suicide after a dramatic accident. Real life is full of very hard questions. Comp will lead to more and more hard questions of that type.



However, given the potential
for getting the substitution level wrong in some way, and the finite
nature of any possible test, just how much can Dick trust that his
friend hasn't been affected in some hard-to-detect way, despite all
his assurances to the contrary?  As you observe, this may well become
a pragmatic, as opposed to merely philosophical, issue in the
not-too-distant future.  Suffice it to say, I'm unlikely to be an
early adopter!

Very wise decision. The pioneers of terrestrial immortality might indeed suffer, first from the inadequacy of the first artificial brains, including a not quite correct choice of level, and second from the inadequacy of the secret encryption, making them prone to being reconstituted by the soul pirates of the future. An artificial brain is like a password: you had better keep it hidden.

Bruno


David

David, acw,


On 01 Apr 2012, at 16:36, acw wrote:

On 4/1/2012 14:33, David Nyman wrote:

Bruno, when you talk about the doctor offering one a replacement brain you usually describe the substitute as digital, although I think you
have sometimes just said that it is artificial.  My recent remarks
about "game physics" got me thinking about this distinction, if indeed
there is one.

Suppose Dick's friend Harry, having been previously diagnosed with an
incurable brain cancer, has had an artificial brain installed. The
doctor tells Dick that he has replaced Harry's brain with a (very
sophisticated!) battery-driven clockwork substitute. Harry tells Dick
that the replacement has been entirely successful: "After the
operation I felt a little woozy at first, but I feel great now.  My
memory is excellent - if anything better than before - and my
appreciation of the finer things in life is as vivid as ever." Dick
is a bit sceptical at first (his faith in clockwork has been
prejudiced by a rather unreliable fake Rolex he bought in Hong Kong) but over a period of several months of careful observation he finds he
can't distinguish any difference whatsoever between Harry's new
clockwork personality and his former self.  Their friendship is
undiminished.

This turns out to be just as well, because - horror of horrors - Dick is shortly afterwards also diagnosed with a terminal brain condition. Should he now be willing to submit to the same procedure as Harry? He is still a little sceptical of clockwork, but the evidence of Harry's
successful transformation is very difficult to discount, and the
doctor shows him several other "before and after" videos with equally convincing outcomes. The artificial brains may be clockwork, but the
doctor assures him it is clockwork of unprecedented  sophistication
and precision, unheard of even in the hallowed halls of Swiss
horology. Dick has stumbled across the Everything List, and is rather persuaded by the computational theory of mind. Trouble is, the doctor
is not of this persuasion.  He tells Dick that the goal of the
operation is only to substitute a clockwork analogue for the
electro-chemical mechanisms of his organic brain, and that on this
basis Dick can confidently expect that the same inputs will reliably
elicit the same responses as before.  Hearing this, Dick is now
worried that, however successful the replacement of Harry's brain has been behaviourally, his friend is now essentially a mindless clockwork
mechanism.

Since he certainly doesn't want to suffer such an indignity, should he
say no to the doctor?  The question that troubles Dick is whether,
assuming comp, he should accept a genuinely
behaviourally-indistinguishable body, irrespective of its brain being organic or clockwork, as an equivalent "avatar" according to the rules
of the comp game-physics.  If so, Dick should have no reason not to
accept a behaviourally-indistinguishable, clockwork-equipped body as
enabling his continued manifestation relative to the familiar
environments to which he has become so emotionally attached. Time is
short, and he must act.  What should he do?

David


It seems to me the question is whether someone should bet on COMP.


David, I agree with acw. If you bet on comp, it does not matter whether the computer runs on clockwork or on the Chinese population, abstracting from the fact that the artificial brain must run in "real time", which means relative to us and the neighborhood.
So the real question, admitting the "truth" of comp, will lie in the choice of the substitution level.
Now, it seems to me Dick should ask Harry, and Harry's wife and friends, whether everything is fine with him. Then it will be only a matter of personal conviction, and a bet on the level of substitution. (Abstracting from the fact that the real choice will be between some PC or Apple, with different prices, software, and applications for the galactic net, onto which you can download yourself with reasonable self-quantum-cryptographic protection.)





If Dick had trouble assigning consciousness to Harry because Dick was a solipsist, then he might have a hard time betting on COMP. Of course, your post does not suggest that Dick held such an opinion; it is just one of many unfalsifiable viewpoints (since one cannot know of any consciousness other than one's own), but not something we think is likely (by induction on observed behavior and its similarity to our internal states).

If Dick thinks mechanism (COMP) is true, then the subjective experience he has corresponds to the inside view of some abstract structure or process implemented in his brain. That is, his brain does not have any magical properties that make it conscious, and the conscious experience one has appears to place us relative to a physical brain (by induction).

By induction we can also observe that changing our brain through medicine, drugs, or other methods (for example, consider a thought experiment about the nature of consciousness when only small parts change: http://consc.net/papers/qualia.html ) also changes our conscious experience; but it shouldn't, if whatever we change doesn't change our functionality.


I would say "the functionality of our parts at some level".




Not accepting that will result in all kinds of strange partial philosophical zombies, which to many people don't make sense; but Dick would have to decide for himself whether they make sense for him or not, and maybe even experiment on himself. After all, the COMP doctor is available.


The cautious answer: I will try the artificial brain for one week, and if it does not work well, I will come back to my organic brain for the month or two I have left to live.
The result of the transplant: Dick makes a big smile and says "Wonderful! It works", then makes again a big smile and says again "Wonderful! It works", and so on ...
I mean the cautious answer is not really operational. No zombie can say "Oh, gosh! I have become a zombie!".

This illustrates that we do indeed need some faith, but of course we need that with organic bodies too. We are just more used to them.





Dick should also consider the UDA and the proof that mechanism is
incompatible with materialism (since Dick assumes the existence of mind and
consciousness by default, I'm not considering that option here).

If Dick thinks COMP is worth betting on, he now only has to worry about
one thing: did his doctor choose the right substitution level?


Exactly.
Note also this. If the clockwork mechanism simulates the whole brain at the electro-chemical level, from a faithful copy at that level, then the clockwork mechanism will run a brain ... developing cancer.
In that case, some higher level might be saner, or the doctor has to fix the cancer in some way, like adding lots of THC routines around the cancer cells; but then smoking pot or THC injections might be more efficacious and affordable than an artificial brain.





If the substitution level is higher than the correct one, he might have slightly different experiences than before, despite no longer being able to tell; or, more worryingly (or possibly the opposite, as it opens up some very interesting possibilities), such an incorrect bet on the level might change his measure, at worst making his experience less stable (jumpy).


There is of course a wide variety of possible experiences for a too-high level: from behaving like an amnesic baby to a large variety of anosognosias (like being blind and unaware of it, that is, blindness plus amnesia of blindness-related notions).
And also, having access to more "normal worlds". Amnesia by itself enlarges the spectrum of accessible realities. There are very plausibly jumps, but with large plateaus of normality, normally ...





Identical behavior over some limited range of time does not necessarily mean a correctly guessed substitution level (can you show that two functions are identical?). What if the level is too low and it depends too much on entangled states (the generalized-brain idea) to stay stable?
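(A small, hypothetical sketch of that parenthetical point; the two function names below are illustrative, not anything from the thread. They stand in for the original and substituted brains: any finite behavioral test over a limited range can pass even though the functions are not identical.)

```python
# Hedged illustration: finite behavioral testing cannot establish that
# two functions are identical. These hypothetical functions agree on
# every input below a threshold but diverge beyond it, so a test suite
# probing only the limited range will judge them equal.

def original_brain(x: int) -> int:
    return x * 2

def substituted_brain(x: int) -> int:
    # Diverges only on inputs the finite test never reaches.
    return x * 2 if x < 1_000_000 else x * 2 + 1

# A finite "behavioral" test over a limited range passes ...
assert all(original_brain(x) == substituted_brain(x) for x in range(10_000))

# ... yet the two functions are not identical:
assert original_brain(2_000_000) != substituted_brain(2_000_000)
```

Which is why Dick's months of observing Harry can at best ground a bet, never a proof.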


If it is too low, but correct, normally this would not change his behavior, but it might restrict his possibilities in the "after-life". Very complex question here. Normally, too low a level should not be a problem (except for the waste of money), unless some low-level features of matter are used to emulate some classical feature. So "too low" needs clarification, because it can be interpreted in many ways.




One could say that the same may be true for his other new digital physical brain, but what if he constantly changes substrates, as a proper substrate-independent mind would like to (VRs and other environments)? Maybe he shouldn't worry that much anyway; we can't really know whether our experience is that continuous either, as we have discontinuities when we sleep (as well as more exotic cases).

It comes down to what Dick finds more plausible, or which heuristics he is willing to use in selecting which theory is more likely to be true. In a way, it's a matter of 'theology' or 'religion', as this belief will have to be taken on faith, even if it is very plausible given our observations.


If Harry remains able to be trusted by Dick, Dick should trust Harry.

As things are going now, we might not have a choice in the matter in the future. We have to separate health from the state, as we separate (or try to separate) religion and state. A pro-life doctor might give you an artificial brain even without asking you. The comp ethic is that comp is a personal choice, and eventually even the choice of level is personal.

David, if Dick does not have the impression that Harry has become a sort of zombie of some kind, for a time, then I would suggest he trust Harry and his doctor, if he is prepared to bet on comp. Once he bets on comp, the nature of the ultimate constituents of whatever does the computation, relative to its usual environments, does not matter.

OK?

Bruno



http://iridia.ulb.ac.be/~marchal/




--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to everything-l...@googlegroups.com.
To unsubscribe from this group, send email to
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en.






