On 26 Jan 2010, at 22:29, Jack Mallah wrote:
> --- On Tue, 1/26/10, Bruno Marchal <marc...@ulb.ac.be> wrote:
>> On 25 Jan 2010, at 23:16, Jack Mallah wrote:
>>> Killing one man is not OK just because he has a brother.
>> In our context, the 'brother' has the same consciousness.
> The "brother" most certainly does not have "the same" consciousness.
> If he did, then killing him would not change the total _amount_ of
> consciousness; measure would be conserved. What the brother does have
> is his own, different in terms of who experiences it, but
> qualitatively identical consciousness.
But then my consciousness here and now is different from my
consciousness after a short interval, and is no longer something
related to "me" in any sense of "me" acceptable under the comp
assumption. It seems that this makes my consciousness here and now
infinitely implausible in the absolute measure (if that makes sense).
I would also not say yes to a computationalist doctor, because my
consciousness would be related to the diameter of the simulated
neurons, or to the redundancy of the gates, etc. (and this even though
the behavior remains unaffected). This also entails the existence of
zombies: if the neurons are very thin, my "absolute" measure can be
made quasi-null, while my behavior again remains unaffected.
From this I conclude you would say "no" to the doctor. All right? The
doctor certainly kills a 'brother'.
> As you should know by now Bruno, if you are now talking about a
> teleportation experiment, in that case you kill one guy (bad) but
> create another, qualitatively identical guy (good). So the net effect
> is OK. Of course the doctor should get the guy's permission before
> doing anything, if he can.
Certainly. But if you mean by this that you say "yes", it is only as a
form of altruism: *you* and *your* consciousness die in the process.
The net effect is OK, but only for your mother, friends, or any
third-person observer. Comp, as I use the term, means that it is OK in
the usual sense of surviving a clinical operation.
> BTW, it may seem that I advocate increased population - that is, if we
> had a cloning device, we should use it. In general, yes, but a planet
> has a limited capacity to support a high population over a long term,
> which we may have already exceeded. Too much at once will result in a
> lower total population over time due to a ruined environment as well
> as lower quality of life. So in practice, it would cause problems.
> But if we had a second planet available and the question is should we
> populate it, I'd say yes.
Apparently we agree on what we disagree about. Your position is not
computationalism, where identity does not depend on the implementation
of the program, and two computers running the same program can be seen
as one special implementation of a single program, as with the
redundant computers in a spacecraft.
> --- On Mon, 1/25/10, Stathis Papaioannou <stath...@gmail.com> wrote:
>> Killing a man is bad because he doesn't want to be killed,
> Actually that's not why - but let that pass for now.
>> and he doesn't want to be killed because he believes that act would
>> cause his stream of consciousness to end. However, if he were killed
>> and his stream of consciousness continued, that would not be a
>> problem provided that the manner of death was not painful. Backing up
>> his mind, killing him and then making an exact copy of the man at the
>> moment before death is an example of this process.
> See above. That would be a measure-conserving process, so it would be OK.
But the measure will depend on the implementation type, with single or
doubled neurons, etc. And this is not relevant if we are digital
machines.
> It is just a matter of definition whether it is the same guy or a
> different guy. Because now we have one guy at a time, it is convenient
> to call them the same guy. If we had two at once, we could call them
> the same if we like, but the fact would remain that they would have
> different (even if qualitatively the same) consciousnesses, so it is
> better to call them different guys.
They would have different consciousnesses (though qualitatively
identical) only if they are genuinely different guys, so it cannot be a
matter of definition. But then what is *a* guy?
>> Making two copies running in lockstep and killing one of them is
>> equivalent to this: the one that is killed feels that his stream of
>> consciousness continues in the one that is not killed. It is true
>> that in the second case the number of living copies of the person has
>> halved, but from the point of view of each copy it is exactly the
>> same as the first case, where there is only ever one copy extant.
> The one that is killed doesn't feel anything after he is killed. The
> one that lives experiences whatever he would have experienced anyway.
> There is NO TRANSFER of consciousness. Killing a guy (assuming he is
> not an evil guy or in great pain) and not creating a new guy to
> replace him is always a net loss.
>> The general point is that what matters to the person is not the
>> objective physical events, but the subjective effect that the
>> objective physical events will have.
> What matters is the objective reality that includes all subjective
> experiences.
In what sense does an objective reality include a subjective
experience? This is a highly ambiguous way of talking. It could entail
a confusion of levels of description, and of perspective.
> Suppose there is a guy who is kind of a crazy oriental monk. He
> meditates and subjectively believes that he is now the reincarnation
> of ALL other people. Is it OK to now kill all other people and just
> leave alive this one monk?
If his belief is true, so that he really is the incarnation of all
other people (which is not plausible), it would be OK to kill all the
other people: nobody would die in that process (assuming he
reincarnates the others in the state they were in just before being
killed(*)).
If his belief is false, so that he does not incarnate the other people
in the appropriate way, then it would be wrong to kill any of them.
I would say that with both comp and Everett-QM, each time you make a
decision, for example to drink a cup of tea instead of a cup of coffee,
you kill the "you" who would have developed a life with the memory of
having drunk a cup of coffee. With computationalism, personal decision
is a form of self-killing.
This does not make any sense with an absolute measure. The point is
that an absolute measure does not make sense with computationalism.
(*) If that clause were not added, we could kill all life on the planet
except one amoeba (and bacteria for its feeding). Once a person
duplicates and gets different memories, we have two persons. Except
enlightened persons, who stop identifying themselves with their
memories (but that is more an altered state of consciousness, and
nobody (notably the doctor) is supposed to believe this is possible).