On Feb 2, 7:07 am, Jack Mallah <jackmal...@yahoo.com> wrote:
> --- On Wed, 1/27/10, Brent Meeker <meeke...@dslextreme.com> wrote:
>
> > Jack is talking about copies in the common sense of initially physically 
> > identical beings who however occupy different places in the same spacetime 
> > and hence have different viewpoints and experiences.
>
> No, that's incorrect.  I don't know where you got that idea but I'd best put 
> that misconception to rest first.
>
> When I talk about copies I mean the same thing as the others on this list - 
> beings who not only start out as the same type but also receive the same type 
> of inputs and follow the same type of sequence of events.  Note: They follow 
> the same sequence because they use the same algorithm but they must operate 
> independently and in parallel - there are no causal links to enforce it.  If 
> there are causal links forcing them to be in lockstep I might say they are 
> shadows, not copies.
>
> Such copies each have their own, separate consciousness - it just happens to 
> be of the same type as that of the others.  It is not "redundancy" in the 
> sense of needless redundancy.  Killing one would end that consciousness, 
> yes.  In philosophy jargon, they are of the same type but are different 
> tokens of it.
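>
> A toy sketch of that type/token distinction (purely illustrative Python;
> nothing here names a real system):
>
>     def step(state, inp):
>         # any deterministic update rule will do; this one is arbitrary
>         return (state * 31 + inp) % 1000003
>
>     inputs = [7, 2, 9, 4]          # both copies receive the same inputs
>     copy_a = copy_b = 0            # same initial state, two separate tokens
>     for x in inputs:
>         copy_a = step(copy_a, x)   # evolves on its own
>         copy_b = step(copy_b, x)   # evolves independently, in parallel
>     assert copy_a == copy_b        # same type of history, yet two tokens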
>
> --- On Thu, 1/28/10, Jason Resch <jasonre...@gmail.com> wrote:
>
> > Total utilitarianism advocates measuring the utility of a population based 
> > on the total utility of its members.
> > Average utilitarianism, on the other hand, advocates measuring the utility 
> > of a population based on the average utility of that population.
>
> I basically endorse total utilitarianism.  (I'm actually a bit more 
> conservative but that isn't relevant here.)  I would say that average 
> utilitarianism is completely insane and evil.  Ending the existence of a 
> suffering person can be positive, but only if the quality of life of that 
> person is negative.  Such a person would probably want to die.  OTOH not 
> everyone who wants to die has negative utility, even if they think they do.
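>
> As a toy illustration of the difference (numbers invented for the example):
>
>     population = [5, 4, 1]   # utilities; the last is below average but positive
>     # total view:   sum(population) = 10
>     # average view: 10 / 3  ~  3.33
>     reduced = [5, 4]         # now remove the below-average member
>     # total falls to 9    -> a loss on the total view
>     # average rises to 4.5 -> a gain on the average view, even though
>     # the person removed had a life worth living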
>
> --- On Wed, 1/27/10, Stathis Papaioannou <stath...@gmail.com> wrote:
>
> > if there were a million copies of me in lockstep and all but one were 
> > destroyed, then each of the million copies would feel that they had 
> > continuity of consciousness with the remaining one, so they are OK with 
> > what is about to happen.
>
> Suppose someone killed all copies but lied to them first, saying that they 
> would survive.  They would not feel worried.  Would that be OK?  It seems 
> like the same idea to me.
>
> > Your measure-preserving criterion for determining when it's OK to kill a 
> > person is just something you have made up because you think it sounds 
> > reasonable, and has nothing to do with the wishes and feelings of the 
> > person getting killed.
>
> First, I should reiterate something I have already said: It is not generally 
> OK to kill someone without their permission even if you replace them.  The 
> reason it's not OK is just that it's like enslaving someone - you are forcing 
> things on them.  This has nothing particularly to do with killing; the same 
> would apply, for example, to cutting off someone's arm and replacing it with 
> a new one.  Even if the new one works fine, the guy has a right to be mad if 
> his permission was not asked for this.  That is an ethical issue.  I would 
> make an exception for a criminal or bad guy who I would want to imprison or 
> kill without his permission.
>
> That said, as my example of lying to the person shows, Stathis, your 
> criterion of caring about whether the person to be killed 'feels worried' is 
> irrelevant to the topic at hand.
>
> Measure preservation means that you are leaving behind the same number of 
> people you started with.  There is nothing arbitrary about that.  If, even 
> having obtained Bob's permission, you kill Bob, I'd say you deserve to be 
> punished if I think Bob had value.  But if you also replace him with Charlie, 
> then if I judge that Bob and Charlie are of equal value, I'd say you deserve 
> to be punished and rewarded by the same amount.  The same goes if you kill 
> Bob and Dave and replace them with Bob' and Dave', or if you kill 2 Bobs and 
> replace them with 2 other Bobs.  That is measure preservation.  If you kill 2 
> Bobs and replace them with only one then you deserve a net punishment.
>
> > > Suppose there is a guy who is kind of a crazy oriental monk.  He 
> > > meditates and subjectively believes that he is now the reincarnation of 
> > > ALL other people.  Is it OK to now kill all other people and just leave 
> > > alive this one monk?
>
> > No, because the people who are killed won't feel that they have continuity 
> > of consciousness with the monk, unless the monk really did run emulations 
> > of all of them in his mind.
>
> They don't know what's in his mind either way, so what they believe before 
> being killed is utterly irrelevant here.  We can suppose for argument's sake 
> that they are all good peasants, they never miss giving their rice offerings, 
> and so they believe anything the monk tells them.  And he believes what he 
> says.
>
> Perhaps what you were trying to get at is that _after_ they are killed, it 
> will be OK if they really do find themselves reincarnated in the monk.  But 
> who decides if that occurred or not?  The monk thinks it did; that criterion 
> would make his belief self-consistent.  Nor can you require the number of 
> people to be conserved - we know fission (as when learning the result of a QM 
> experiment) and fusion would be possible.  Nor can you use the criterion of 
> memory, unless you are prepared to say that memory loss changes one person 
> into a different person.  If so, you will die when you forget where you put 
> your car keys. 
>
> The reality is, there is no non-arbitrary criterion for personal identity 
> over time.
>
> Personal identity, being a matter of arbitrary definition, can have no 
> relevance to what is observable.  What matters is the measure distribution.
>
> > The fact of the matter is that we are *not* the same physical being 
> > throughout our lives. The matter in our bodies, including our brains, turns 
> > over as a result of metabolic processes so that after a few months our 
> > physical makeup is almost completely different. It is just a contingent 
> > fact of our psychology that we consider ourselves to be the same person 
> > persisting through time. You could call it a delusion. I recognise that it 
> > is a delusion, but because my evolutionary program is so strong I can't 
> > shake it, nor do I want to. I want the delusion to continue in the same way 
> > it always has: the new version of me remembers being the old version of me 
> > and anticipates becoming the even newer version of me.
>
> If you really acknowledge that it is a delusion, that is good progress.  But 
> you are wrong that you can't shake it.  In fact, if you admit it is a 
> delusion then you admit that it is false and you have already shaken it.
>
> Of course, my utility function remains strongly peaked in favor of people 
> very similar to my current self so that in practical terms I behave normally.
>
> > And it wouldn't matter to me if more copies of me were destroyed than 
> > reconstituted or allowed to live, since each of the copies would continue 
> > to have the delusional belief that his consciousness will continue in the 
> > sole survivor.
>
> That is the heart of the matter.  Such a delusion is both false and 
> dangerous.  My task is to convince you, and others like you, that while your 
> current consciousness will not continue per se no matter what, the 
> consciousness of your future selves has value.  And the more of them there 
> are (in terms of measure) the more value, because they do not share a single 
> consciousness even if they are all of the same type.
>
> --- On Tue, 1/26/10, Nick Prince <m...@dtech.fsnet.co.uk> wrote:
>
> > It seems that at the root of things you are arguing that if you have one 
> > person up until time t and can make a so called identical copy at that time 
> > (or another), then whether the original is killed or not, if the new copy 
> > is instantiated then it would "feel" and therefore think itself to be the 
> > person it was (because of memories), but that would be an illusion (a bit 
> > like the replicants in Blade Runner).
>
> Nick, you too seem to be missing the larger point that personal identity is 
> not fundamental.  Assuming they both live, the new copy has as much claim to 
> be the original as the future version of the original body.  I could equally 
> say that _neither_ of these future people are the same person as the past 
> original.  It is not a meaningful question.  What IS meaningful is that 
> copying increases the number of consciousnesses, while killing decreases it.
>
> > Forgetting about MWI for now and just thinking about why I feel some 
> > continuity in my subjective experience.  I feel that it must have something 
> > to do with the fact that me at time t+dt is an (almost) identical copy to 
> > the me at time t.  If I deny this then I could accept there was no TRANSFER 
> > of consciousness between copies.  Yet my experience makes me feel that 
> > there is?
>
> Your brain gives rise to consciousness at time t1.  It also gives rise to 
> consciousness at time t2.  Was there any transfer of consciousness from time 
> t1 to t2?  No, because whether your brain gives rise to consciousness at time 
> t depends only on the situation at time t.  If time t1 never existed and your 
> brain sprang into existence just before t2, in the same state it would 
> otherwise have had at t2, there would be no difference at t2.
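>
> In code form (a toy stand-in, not a model of a brain):
>
>     def experience(state):
>         # what the state gives rise to depends on that state alone
>         return hash(state)
>
>     long_history = ["s0", "s1", "s2"]   # a brain that existed at t1
>     fresh_brain  = ["s2"]               # sprang into existence just before t2
>     assert experience(long_history[-1]) == experience(fresh_brain[-1])
>     # same state at t2, same result: nothing needed to be transferred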
>
> Of course, there is transfer of information from t1 to t2 due to the laws of 
> physics, even if your brain went unconscious in the meantime; this is 
> causality.  When we talk about different copies, even this is completely out 
> of the picture; there is no information transfer between copies.
>
> This is a key point so I'll try to illustrate it with a diagram:
>
> 1 ---------------------------------------------------
>
> 2 ---------------------------------------------------
>
> Say this represents 2 copies of Bob, with forward time being to the right.  
> Each "-" is an observer-moment (OM).  The copies remain identical until they 
> suddenly terminate, receiving the same kind of inputs and so on.  They evolve 
> independently though; perhaps they
>
> ...



NP
OK thanks Jack - this is interesting.  Perhaps what I should have said
in the paragraph above was the same thing but with the word CONTINUATION
rather than TRANSFER, so it would read:

“>> Forgetting about MWI for now and just thinking about why I feel
some continuity in my subjective experience.  I feel that it must have
something to do with the fact that me at time t+dt is an (almost)
identical copy to the me at time t.  If I deny this then I could
accept there was no CONTINUATION of consciousness between copies.  Yet
my experience makes me feel that there is?”

So what I am saying here is that consciousness feels as though it has
continuity, and a *good* copy would think it had continuity with the
original version - that being the reason why I feel I am the same or
similar person today as I was yesterday.  (I am also bearing in mind
here the copying idea that Bruno uses in his Washington/Moscow thought
experiment.)
I take your point about the TRANSFER of consciousness, because that
does beg the question of what kind of stuff consciousness is; and yes,
you would possibly have to account for a greater amount of “fluid”
(for want of an analogy) flowing down the second branch you drew.
Such a transfer would also be problematic because it would have to
span large spacetime intervals (non-locality and all that).
However, I am not convinced that there would be no feeling of
continuity of consciousness between copy and original, just as there
is between the me at time t and the one at time t+dt.  An appropriate
copying device (if one were possible) would be expected to copy enough
information that the laws of physics (which I presume are covariant)
acting on the instantiated object would enable this continuity by
design.  Unless, that is, you are saying that such copying devices
could never be made because the human brain's substrate is special in
some way.  I cannot see that we are that special, and I cannot think
of mind as the consequence of something unphysical.

Best wishes
Nick



