RE: More is Better (was RE: another puzzle)

2005-07-02 Thread Stathis Papaioannou

Lee Corbin writes:

[quoting Stathis]

 I think that if it is given that either you or your duplicate
 must die, then you should willingly sacrifice yourself if it will
 enrich your duplicate.
 
 Either way, I think you wake up the next morning very satisfied
 with the outcome.

 How do you wake up the next morning if you're the one who died? Unless you
 can effect some sort of mind merge just before dying, you lose all the
 experiences that you have had since you and your duplicate diverged...

Well, Stathis, for heaven's sake!  You've already admitted that a
little memory loss does not threaten your identity!  Recall the Aussies
you wrote about who customarily lose an entire evening's inebriation :-)


Yes, and I also admitted that there is an inconsistency in my position. 
Having my duplicate who has already diverged live on while I die is not just 
memory loss, but rather replacement of the lost memories with someone 
else's, which I feel is a greater threat to my identity and which I would be 
less likely to agree to. Memory loss would be more like having myself backed 
up and the backup run after I have died. If the backups are frequent, I 
suppose it is better than no backup at all, but I would still feel afraid of 
dying. At its most basic, for me anyway, the fear of imminent death is the 
fear that the person I am *now* will be wiped from the universe and never 
have any more experiences. The same consideration ought to apply to memory 
loss, but people don't generally think of it that way, because they know 
that they'll be OK afterwards, on the basis of past experience.



So remember: your duplicate in the next room is *exactly* in the
same state you are in right now if you lose a little recent memory,
and then have some new experiences that are identical to his over
the last few minutes. So he *is* you!  That's what I mean when I
say that we must regard duplicates as selves.



And let's go back to this crazy transfer that seems (from my
viewpoint) to occupy the attention of those who believe in
continuers.  So you expect to be the person who arrives at
the other end if you are disintegrated here and teleported
there. You even expect to be him if the original here is
not disintegrated.  (Here I must lash out at the bizarre
probability calculus that ensues for many at this point:
whether or not they *are* the remote version seems to depend
on what happens locally here. Sheesh.)


There is a transfer of information in order to effect the teleportation, but 
this is just a technical detail; there is no actual transfer of identity, 
if that is what you meant. There are only people's beliefs and memories 
concerning who they are and who they were, coupled with the fact that human 
minds can only experience being one person at a time. As for how what 
happens here can affect whether or not the person entering the teleporter 
finds himself over there: if you destructively teleport an apple there, 
there is a 100% chance that the apple is over there; whereas if you 
non-destructively teleport an apple there, choosing an apple at random 
because you are only allowed one apple at a time, there is a 50% chance it 
will be the one here and a 50% chance it will be the one there. Where's the 
problem?



So if all the 1000 Stathis's in the various rooms are to die
but one, then that one continues all the others. Now we
play a trick. The 1000 don't actually die but are placed in
instantaneous suspended animation.

Oops!  The big Bean-Counter Upstairs who keeps track of where
the serial numbers go is confused!  He had better have all
1000 Stathis's continue in the one, just in case something
goes wrong with the suspended animation machinery.  But then...
what to do when nothing goes wrong?  How to send all the souls
back into the original bodies???  (Of course, here in this paragraph
I am just attempting to ridicule a point of view in which I do not
believe. The truth, of course, is that no transfers take place,
and the whole idea of a continuer is wrong.)

If you believe that the 1000 will continue in the one (what, I
wonder, with probability 1000?), then they'll continue in the
one whether or not they're disintegrated.


Sorry, I don't understand what you're saying here. Do you mean that 1000 
versions of me are running in parallel and all but one are stopped or 
suspended? It's obvious to me that however often the number running is 
changed, I won't be able to tell that there is any difference. This wouldn't 
work if they had serial numbers because if each version knew his number, 
they would start to diverge, and stopping one of them would then lead to the 
loss of unique experience, as discussed above (you might say it doesn't 
matter if it's something as trivial as a serial number, like losing a second 
of memory, but the point still stands). It wouldn't work if they had souls 
either, because killing some of them, even if they remained running in 
parallel, would send the souls to heaven or hell.

Re: More is Better (was RE: another puzzle)

2005-07-01 Thread Eugen Leitl
On Thu, Jun 30, 2005 at 04:25:09PM -0700, Lee Corbin wrote:

  I've sometimes wondered whether some anaesthetics might work this way: put
  you into a state of paralysis, and affect your short term memory. So you
  actually experience the doctor cutting you open, with all the concomitant
  pain, but you can't report it at the time and forget about it afterwards. If
  you knew an anaesthetic worked that way, would you agree to have it used on
  you for surgery?

Midazolam (Dormicum) has this property, and is routinely used in anaesthesia
for that purpose (patient partially wakes up during surgery, has an unpleasant
experience, the drug is administered to erase short-term memory (mostly)).

Many other drugs (some antibiotics, also alcohol) also have this property.

Speaking of alcohol: anyone who considers that consciousness is a boolean
property is very welcome to a personal experiment involving measuring the 
correlation of the degree of awareness with alcohol content in blood, 
titrating until loss of consciousness.

 When I was in high school, I read that dentists were considering
 use of a new anaesthetic with this property. I was revolted, and
 even more revolted when none of my friends could see anything
 wrong with it.

I understand such drugs are currently considered for an early therapy for 
traumatic incidents (if you can't remember it, you won't be traumatized by
recurring memories).

 Experiences are real, whether you remember them or not.

-- 
Eugen* Leitl http://leitl.org
__
ICBM: 48.07100, 11.36820    http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE




Re: More is Better (was RE: another puzzle)

2005-07-01 Thread Eugen Leitl
On Thu, Jun 30, 2005 at 07:07:35PM -0700, Jonathan Colvin wrote:

 I'm sure they are. Awareness with no memory would be complete confusion
 (you'd have no idea what any of your sense qualia refer to; or of much else,

E.g. severe Wernicke-Korsakoff syndrome patients have no short term memory.

-- 
Eugen* Leitl http://leitl.org
__
ICBM: 48.07100, 11.36820    http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE




RE: More is Better (was RE: another puzzle)

2005-07-01 Thread Stathis Papaioannou

Lee Corbin writes:

[quoting Stathis]
 These are not trivial questions. The basic problem is that our minds have
 evolved in a world where there is no copying and no memory loss (memory loss
 may have occurred naturally, of course, but evolution's answer to it would
 have been to wipe out the affected individual and their genes), so there is
 a mismatch between reason and intuition.

Well, it's time to at least be verbally able to prescribe what
one would do. The flat, linear model suggests that more good
runtime for me is good, less is worse, and bad runtime is worst
of all.

I think that if it is given that either you or your duplicate
must die, then you should willingly sacrifice yourself if it will
enrich your duplicate.

Either way, I think you wake up the next morning very satisfied
with the outcome.


How do you wake up the next morning if you're the one who died? Unless you 
can effect some sort of mind merge just before dying, you lose all the 
experiences that you have had since you and your duplicate diverged, and you 
will never have any more new experiences or knowledge of the world. That's 
the problem with dying!


I still don't really understand why you are so insistent that your duplicate 
is you and should be considered on a par with yourself when it comes to 
deciding what is in your self-interest. You have arrived at this conclusion 
from the fact that you and he were physically and mentally identical at the 
moment of duplication, and will remain more similar than a pair of identical 
twins despite diverging post-duplication. However, I don't see why it is any 
less valid or less rational if I say that I find the idea of having a 
duplicate around disturbing, and would prefer not to be duplicated, 
especially if there is a chance the two of us might meet.


--Stathis Papaioannou





RE: More is Better (was RE: another puzzle)

2005-06-30 Thread Stathis Papaioannou

Lee Corbin writes:

[quoting Stathis]

 I believe that even though someone can only
 *experience* being one person at a time, in
 the event of duplication all the copies have
 an equal claim to being continuations of the
 original, and it is in attempting to reconcile
 these two facts that I arrive at the notion of
 subjective probabilities for the next observer
 moment.

It's with the very notion of a continuer that
I've always had a problem. So let me ask you
a little about it. Now clearly, if ten minutes
from now the Earth Stathis is to be killed, but
a Martian duplicate is made five minutes from
now, you---the present Stathis---don't really
have a problem with that (that is, not a problem
that couldn't be fixed with a big bribe).

So next, let me ask about a new situation in
which one hour from now you are to die here,
but a duplicate of you will the same second
be established on Mars, only this duplicate
has a little amnesia, and doesn't remember
the last half-hour of its life. Would that
be a continuer of you?  Would it be a
continuer of you+60_minutes from now?
Will it only be a continuer of you+30_minutes
from now? Is it a continuer of the you-now?

How about this? For ten million dollars, would
you agree to have the last ten minutes of your
memory erased, where you are now?


These are all interesting questions that have bothered me for a long time. I 
think the most useful suggestion I can make about how to decide whether 
other versions of a person are or aren't continuers or the same person 
is to avoid a direct answer at all and ask - as you have done - how much 
memory loss a person would tolerate before they felt they would not be the 
same person. This is something that comes up all the time in clinical 
situations. I would say that definitely I would not want total memory 
erasure at any price, because that would be like dying. On the other hand, 
10 minutes of memory loss for ten million dollars (especially if they were 
US dollars, instead of our prettier and more durable, but less valuable, 
Australian kind) is an offer I would definitely take up; in fact, people pay 
to get drunk on a Friday night and suffer more memory loss than this.


Before you ask, this raises another interesting question: would I agree for 
the same amount of money to be painlessly killed 10 minutes after being 
duplicated? Given that I believe my duplicate provides seamless continuity 
of consciousness from the point of duplication, this should be the same as 
losing 10 minutes of memory. However, I would probably balk at being 
killed if it were happening for the first time, and I might hesitate even 
if I knew that it had happened to me many times before.


Yet another variation: for 10 million dollars, would you agree to undergo a 
week of excruciating pain, and then have the memory of the week wiped? What 
if you remember agreeing to this 100 times in the past; that is, you 
remember agreeing to it, then a moment later experiencing a slight 
discontinuity, and being given the ten million dollars (which let's say you 
gambled all away). You were told every time you would experience pain, but 
all you experienced was being given the money. Would it be tempting to agree 
to this again (and this time, I'll put the money in the bank)?


These are not trivial questions. The basic problem is that our minds have 
evolved in a world where there is no copying and no memory loss (memory loss 
may have occurred naturally, of course, but evolution's answer to it would 
have been to wipe out the affected individual and their genes), so there is 
a mismatch between reason and intuition.


--Stathis Papaioannou





RE: More is Better (was RE: another puzzle)

2005-06-30 Thread Jonathan Colvin
Stathis wrote:

 Yet another variation: for 10 million dollars, would you 
 agree to undergo a week of excruciating pain, and then have 
 the memory of the week wiped? What if you remember agreeing 
 to this 100 times in the past; that is, you remember agreeing 
 to it, then a moment later experiencing a slight 
 discontinuity, and being given the ten million dollars (which 
 let's say you gambled all away). You were told every time you 
 would experience pain, but all you experienced was being 
 given the money. Would it be tempting to agree to this again 
 (and this time, I'll put the money in the bank)?

I've sometimes wondered whether some anaesthetics might work this way: put
you into a state of paralysis, and affect your short term memory. So you
actually experience the doctor cutting you open, with all the concomitant
pain, but you can't report it at the time and forget about it afterwards. If
you knew an anaesthetic worked that way, would you agree to have it used on
you for surgery?

Jonathan Colvin



Re: More is Better (was RE: another puzzle)

2005-06-30 Thread Johnathan Corgan

Jonathan Colvin wrote:


I've sometimes wondered whether some anaesthetics might work this way: put
you into a state of paralysis, and affect your short term memory. So you
actually experience the doctor cutting you open, with all the concomitant
pain, but you can't report it at the time and forget about it afterwards. If
you knew an anaesthetic worked that way, would you agree to have it used on
you for surgery?


Here is a similar situation.

I had a medical procedure performed using something called conscious 
sedation.  In this technique, a drug was administered (Versed in my 
case) which allowed me to retain consciousness and even engage my doctor 
in conversation.  Yet no long term memories were laid down.


This temporary anterograde amnesia is the same experience as above, 
except I wasn't paralyzed and was free to report any experienced pain to 
 my doctor.


In my case, this was a (supposedly) mildly painful procedure, yet I in 
fact have a puzzling gap in my continuity of memory and have no 
recollection of any pain (or of anything else) during that time period. 
 For all I know, I was in agony and had to be in full restraints to 
allow things to proceed--without anyone telling me what happened, I have 
no way to know.


Today I'd do this again without hesitation.  I wish my dentist were 
licensed to do this so the next time I have to have a root canal I can 
have no memory of it afterwards.


(As an aside, Versed is quick to act but slow to recover.  It's very 
difficult to describe the 1st person experience here but I have memories 
of something I can only call gradual awareness that got better over a 
period of a couple hours, yet the nursing staff said I was talking to 
them on and off during this whole period.  Weird.)


-Johnathan



RE: More is Better (was RE: another puzzle)

2005-06-30 Thread Lee Corbin
Stathis writes

 How about this? For ten million dollars, would
 you agree to have the last ten minutes of your
 memory erased, where you are now?
 
 These are all interesting questions that have bothered me for a long time. I 
 think the most useful suggestion I can make about how to decide whether 
 other versions of a person are or aren't continuers or the same person 
 is to avoid a direct answer at all and ask - as you have done - how much 
 memory loss a person would tolerate before they felt they would not be the 
 same person.

Yes, it's a key question. All we can assert for sure is
that gradually between 0% and 100% it pays less and less
to think that you're the same person. The same answer
obtains if we measure in terms of how much recent memory
could be lost: on this, I happen to feel that I am about
fifty percent the same person at 18 that I am now.

 10 minutes of memory loss for ten million dollars is an offer
 I would definitely take up; in fact, people pay 
 to get drunk on a Friday night and suffer more memory loss than this.

Right.

 Before you ask, this raises another interesting question: would I agree for 
 the same amount of money to be painlessly killed 10 minutes after being 
 duplicated? Given that I believe my duplicate provides seamless continuity 
 of consciousness from the point of duplication, this should be the same as 
 losing 10 minutes of memory. However, I would probably balk at being 
 killed if it were happening for the first time, and I might hesitate even 
 if I knew that it had happened to me many times before.

Well, the biggest point of philosophy for me is that it be
prescriptive.  Suppose that you have to figure out all this
ahead of time---or would you prefer just to go with a gut
instinct when the time comes?

 Yet another variation: for 10 million dollars, would you agree to undergo a 
 week of excruciating pain, and then have the memory of the week wiped?

No.

 What if you remember agreeing to this 100 times in the past; that is, you 
 remember agreeing to it, then a moment later experiencing a slight 
 discontinuity, and being given the ten million dollars (which let's say you 
 gambled all away).

This is a horrific situation that I would put a stop to
if I could. One of my old thought experiments was that
you start noticing a $1000 increase in your bank account
every day.  Then after a month or so, you learn that you
are every night being awakened and tortured for an hour,
and then the memory is erased.

My way of looking at it provides a clear answer: you are
your duplicates, and so just because you don't remember
something doesn't mean that it didn't happen (or isn't
happening) to you.

 These are not trivial questions. The basic problem is that our minds have 
 evolved in a world where there is no copying and no memory loss (memory loss 
 may have occurred naturally, of course, but evolution's answer to it would 
 have been to wipe out the affected individual and their genes), so there is 
 a mismatch between reason and intuition.

Well, it's time to at least be verbally able to prescribe what
one would do. The flat, linear model suggests that more good
runtime for me is good, less is worse, and bad runtime is worst
of all.

I think that if it is given that either you or your duplicate
must die, then you should willingly sacrifice yourself if it will
enrich your duplicate.

Either way, I think you wake up the next morning very satisfied
with the outcome.

Lee



Re: More is Better (was RE: another puzzle)

2005-06-30 Thread Russell Standish
This leads to a speculation that memories are an essential requirement
for consciousness...

On Thu, Jun 30, 2005 at 02:08:42PM -0700, Johnathan Corgan wrote:
...

 
 (As an aside, Versed is quick to act but slow to recover.  It's very 
 difficult to describe the 1st person experience here but I have memories 
 of something I can only call gradual awareness that got better over a 
 period of a couple hours, yet the nursing staff said I was talking to 
 them on and off during this whole period.  Weird.)
 
 -Johnathan

-- 
*PS: A number of people ask me about the attachment to my email, which
is of type application/pgp-signature. Don't worry, it is not a
virus. It is an electronic signature, that may be used to verify this
email came from me if you have PGP or GPG installed. Otherwise, you
may safely ignore this attachment.


A/Prof Russell Standish                  Phone 8308 3119 (mobile)
Mathematics                              0425 253119
UNSW SYDNEY 2052                         [EMAIL PROTECTED]
Australia                                http://parallel.hpc.unsw.edu.au/rks
International prefix  +612, Interstate prefix 02





Re: More is Better (was RE: another puzzle)

2005-06-30 Thread Johnathan Corgan

Lee Corbin wrote:


When I was in high school, I read that dentists were considering
use of a new anaesthetic with this property. I was revolted, and
even more revolted when none of my friends could see anything
wrong with it.

Experiences are real, whether you remember them or not.


It's interesting how different people react to things.  I've actually 
been through this (see previous post); it's not theoretical for me.  And 
I would do it again, and wish my dentist could use this technique.


(Of course, in my case, it was for a semi-surgical procedure that I 
could probably have withstood with conscious sedation; I don't think I'd 
choose this for open heart surgery!)


Here is a case where I voluntarily chose to undergo a mildly painful 
experience with the foreknowledge that I would have no recall of it.  I 
am none the worse for it.  Did I experience pain?  Yes, so I am told. 
 Was that experience real?  Sure.  Can I relive that experience in my 
memory?  Not a chance.  And that's how I wanted it.  What is so 
revolting about it?


What's behind the strong emotion here?  (You seem to have had a similar 
reaction to the events depicted in Brin's Kiln People.)


-Johnathan



Re: More is Better (was RE: another puzzle)

2005-06-30 Thread Johnathan Corgan

Johnathan Corgan wrote:
(Of course, in my case, it was for a semi-surgical procedure that I 
could probably have withstood with conscious sedation; I don't think I'd 

   ^^ "with" above should read "without"
-Johnathan



Re: More is Better (was RE: another puzzle)

2005-06-30 Thread Johnathan Corgan

Russell Standish wrote:


This leads to a speculation that memories are an essential requirement
for consciousness...


I agree.  Had I known then what I know now, I would have asked the 
nursing staff and doctor to question me in detail about my first person 
experience *while it was happening*, since all I can think about now is 
how I felt before and after.


Was I oriented to time, place, who I was, and what was happening to me?

Did my first person experience of consciousness seem any different? 
(Aside from the obvious mellowness that any sedative induces.)


While I was undergoing the procedure, and feeling the pain, did I regret 
the decision to be awake but not remember later?


Knowing that I would forget this, is there anything about what I was 
experiencing that I'd want to be noted so I could read about it afterward?


etc.

So I do wonder, if I was awake and responding accurately to verbal 
cues, but not laying down memories, was I really conscious?  Of 
course, it *seems* to me now that I was unconscious the whole time, with 
some odd emergent effects as the Versed wore off.  But as I've 
gathered from reading folks like Dennett, what things seem like and what 
actually is happening can be very different things.


Performing the question and answer session described above is at least 
part of the reason for my willingness to undergo conscious sedation again.


-Johnathan







RE: More is Better (was RE: another puzzle)

2005-06-30 Thread Stathis Papaioannou

Jonathan Colvin writes:


 Yet another variation: for 10 million dollars, would you
 agree to undergo a week of excruciating pain, and then have
 the memory of the week wiped? What if you remember agreeing
 to this 100 times in the past; that is, you remember agreeing
 to it, then a moment later experiencing a slight
 discontinuity, and being given the ten million dollars (which
 let's say you gambled all away). You were told every time you
 would experience pain, but all you experienced was being
 given the money. Would it be tempting to agree to this again
 (and this time, I'll put the money in the bank)?

I've sometimes wondered whether some anaesthetics might work this way: put
you into a state of paralysis, and affect your short term memory. So you
actually experience the doctor cutting you open, with all the concomitant
pain, but you can't report it at the time and forget about it afterwards. If
you knew an anaesthetic worked that way, would you agree to have it used on
you for surgery?


I've thought about exactly this while sitting for hours as the assistant 
anaesthetist during long operations! In fact, we know that in some cases it 
is exactly what happens. If the anaesthetic is underdone and the patient 
starts moving when the surgeon starts cutting, it is possible that the 
patient will remember being in terrible pain after he wakes up, and sue the 
anaesthetist. Therefore, if an incident like this occurs, the anaesthetist 
gives the patient a bolus of IV midazolam, which usually ensures that the 
patient has no memory of the incident, and everyone is happy. I wonder what 
a court would say if the patient somehow found out what had happened and 
decided to sue anyway, arguing that although he couldn't remember it, he 
must have been in excruciating pain, and therefore deserves compensation for 
the suffering caused by the doctor's negligence?


--Stathis Papaioannou





RE: More is Better (was RE: another puzzle)

2005-06-30 Thread Jonathan Colvin
Russell Standish wrote:

 This leads to a speculation that memories are an essential 
 requirement for consciousness...

I'm sure they are. Awareness with no memory would be complete confusion
(you'd have no idea what any of your sense qualia refer to; or of much else,
either). That's why consciousness is *not* a binary phenomenon. As babies
grow and gain memories and knowledge, they *gradually* become conscious.
This is one reason ethicist Peter Singer ascribes a lower intrinsic
person-ness to infants and the mentally retarded as compared to competent
adults.

Jonathan Colvin



Re: More is Better (was RE: another puzzle)

2005-06-30 Thread Russell Standish
Yes, but you would still have to give meaning to "half-conscious" (or
"1/10th conscious", or any other such real number). What about "double
conscious"? Or is consciousness bounded by a given number (e.g. 1)?

I do not know how to give meaning to these questions. 

I do appreciate the analogy with the question "What is the first
mammal?". Mammalness involves a range of phenotypic features that
separate mammals from reptiles, yet undoubtedly there were interim
forms that exhibited some but not all features of mammals. Were these
animals mammals or reptiles? Perhaps the question is meaningless, and
a different classification is needed.

If that is the case, then the question of whether babies are conscious
or not may not actually be meaningful. They exhibit certain
characteristics at some points, whereas others kick in later. The
ability to keep track of external objects even when hidden from view
happens pre-lingually, for example (babies are surprised when an
object disappearing behind a screen is not there when the screen is
removed), whereas self-awareness is apparently not
present until 18 months or so. Long-term memories are not laid down
until much later, but even then, how long is "long term"? My son at age
2.75 could remember events that happened a year earlier, at age
1.75. Yet now (at age 7) he has trouble remembering things that
happened at age 4.

Cheers

On Thu, Jun 30, 2005 at 07:07:35PM -0700, Jonathan Colvin wrote:
 Russell Standish wrote:
 
  This leads to a speculation that memories are an essential 
  requirement for consciousness...
 
 I'm sure they are. Awareness with no memory would be complete confusion
 (you'd have no idea what any of your sense qualia refer to; or of much else,
 either). That's why consciousness is *not* a binary phenomenon. As babies
 grow and gain memories and knowledge, they *gradually* become conscious.
 This is one reason ethicist Peter Singer ascribes a lower intrinsic
 person-ness to infants and the mentally retarded as compared to competent
 adults.
 
 Jonathan Colvin

-- 
*PS: A number of people ask me about the attachment to my email, which
is of type application/pgp-signature. Don't worry, it is not a
virus. It is an electronic signature, that may be used to verify this
email came from me if you have PGP or GPG installed. Otherwise, you
may safely ignore this attachment.


A/Prof Russell Standish                  Phone 8308 3119 (mobile)
Mathematics                              0425 253119
UNSW SYDNEY 2052                         [EMAIL PROTECTED]
Australia                                http://parallel.hpc.unsw.edu.au/rks
International prefix  +612, Interstate prefix 02





RE: More is Better (was RE: another puzzle)

2005-06-30 Thread Lee Corbin
Johnathan writes

 Lee Corbin wrote:
 
  When I was in high school, I read that dentists were considering
  use of a new anesthetic with this property. I was revolted, and
  even more revolted when none of my friends could see anything
  wrong with it.
  
  Experiences are real, whether you remember them or not.
 
 It's interesting how different people react to things.  I've actually 
 been through this (see previous post); it's not theoretical for me.  And 
 I would do it again, and wish my dentist could use this technique.

I thought that you said that the pain wasn't all that terrible.
Moreover---and this is key---were there or were there not better
anesthetics available?

 (Of course, in my case, it was for a semi-surgical procedure that I 
 could probably have withstood with conscious sedation; I don't think I'd 
 choose this for open heart surgery!)
 
 Here is a case where I voluntarily chose to undergo a mildly painful 
 experience with the foreknowledge that I would have no recall of it.  I 
 am none the worse for it.  Did I experience pain?  Yes, so I am told. 
   Was that experience real?  Sure.  Can I relive that experience in my 
 memory?  Not a chance.  And that's how I wanted it.  What is so 
 revolting about it?
 
 What's behind the strong emotion here?  (You seem to have had a similar 
 reaction to the events depicted in Brin's Kiln People.)

The strong emotion is a reaction to the mistaken idea that can come to
people that there is a free lunch here: namely, that anything goes so
long as it's not remembered. Where will this stop?  Can we at
once begin the horrific and gruesome experiments in the penal
institutions?  (After all, I am sure that a number of scientists
are truly interested in pain, and so if we conveniently remove
the moral element here---since the inmates don't remember the
pain it doesn't count---they can go for broke.)

So would it be worth it to you to be awakened every night at 3am
and hideously tortured for an hour provided (1) you never remember
it the next day (2) there is no adverse ill effect (say, lack of
sleep), and (3) you are very well paid for it, say $1000 per day?

If anyone wants to go for that, what then if subjectively the 
hour of torture is made to be a century?  So that during each
nightly century you realize over the decades and decades that
the fool that you are during the day has made and is making a
gross mistake. For you come to realize after the first few
years of some particular night that your real life is a life
of total pain and agony; that the mere days which interrupt
the centuries (the days that your day self can remember) are
relatively meaningless interludes, as nothing compared to
each night's torment.

Or, on the other hand, since by the next morning you don't
remember it, what difference did it make?

Lee



RE: More is Better (was RE: another puzzle)

2005-06-28 Thread Lee Corbin
Stathis writes

 This brings up an interesting conundrum that I raised three or four torture 
 experiments ago. Given 10 instantiations of a person having an unpleasant 
 experience E ... for example 10 sentient programs running in parallel, is
 it better, if we aim to reduce suffering, to (a) terminate 9 of the 10
 programs and leave one still running and experiencing E, or (b) stop 5 of
 the 10 programs from experiencing E, but leave them running,

Ah ha! And having what sort of experience?  It's crucial: I shall
assume that they are having very mildly positive experiences, e.g.,
that their lives are barely worth living in this condition.

 and leave the other 5 programs continuing to experience E?

I will say that it is better to terminate 9 of the 10,
because we are given that the experience E is horrible:
therefore the total benefit to the person is calculated
as follows:

 5*(-1000) + 5*(2)  <  1*(-1000)

where, say, E is worth -1000 to you, and the so-so day
worth 2. This uses the additivity of benefit. It also
accords with common sense, in that if you had to sign
up to go through in sequence the experiences of the 10
(five tortures and five so-so days), or instead sign up
to go through just 1 torture, you'd sign up for the latter.
Especially if you were forced to have a run-through of
each ahead of time!
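
For readers who want to check the arithmetic, here is a minimal sketch in
Python, assuming only the values stipulated just above (-1000 for a session
of E, +2 for a so-so day); the variable names are illustrative, not part of
the original post.

    # Additivity-of-benefit sum with the stipulated values.
    E_VALUE = -1000      # benefit of one session of experience E
    SO_SO_VALUE = 2      # benefit of one barely-worth-living session

    # Option (b): 5 copies keep experiencing E, 5 copies get the so-so day.
    benefit_b = 5 * E_VALUE + 5 * SO_SO_VALUE   # -4990

    # Option (a): terminate 9 of the 10 copies; 1 copy keeps experiencing E.
    benefit_a = 1 * E_VALUE                     # -1000

    print(benefit_b < benefit_a)  # True: summing over copies favours (a)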

 If you do the total suffering equation assuming that each
 instantiation is separate, (a) is better.

And that is the answer I arrived at.

 But I would argue that if you are one of the suffering victims,
 (a) does you no good at all: subjectively, you will continue to
 suffer, since the one remaining program that is running will 
 serve as continuation for any of the 9 terminated ones.

Here is the dreadful "closest continuer" method of Nozick and
others. I claim it gives the wrong answer. Look, the continuation
happens anyway, whether you die here or not!  Especially if the
events are outside each other's light cones, how can what happens
here possibly affect what happens there?  Just because you, say,
are *not* terminated here does not mean that you don't continue
there just as much.

What happens in box P far, far away from box Q does not affect what
happens in box Q. If you are in pain in box P, then it doesn't
matter to you in P what goes on in Q. Your benefit seems to me to
be just the sum of the two, much as if you experienced them as a
single copy, but had memory erasure between the experiences.

 In fact, there is no way for someone inside the simulated system
 to know that any of the instantiations had been terminated, as
 long as at least one keeps running. 

Well, even if they are ALL terminated, the subject does not know.
Knowing is an activity, and so you don't know if suddenly you all
die.

 On the other hand, with (b) there is subjectively a 50% probability that 
 your suffering will end.

People who use probability when discussing duplicates seem to talk
as though an executing process had a magical serial number generated
by God. When the subject dies, it's as if the serial number is
instantaneously transferred to *one* other of the possible systems
that could support him, but not to the others. But there are no souls.
There are no serial numbers. You become all the others equally, and
with 100% probability for each. You even become them if you do
not die.

What will happen if you choose A is that you will experience E through
one bad session. If you choose B, you will experience E through five
bad sessions, and E' through five so-so sessions. Given that E' is of
no particular value either positive or negative, A is the better choice.
(This is just using different language to do the same calculation as above.)

Lee



RE: More is Better (was RE: another puzzle)

2005-06-28 Thread Stathis Papaioannou

Lee Corbin writes:

 This brings up an interesting conundrum that I raised three or four torture
 experiments ago. Given 10 instantiations of a person having an unpleasant
 experience E ... for example 10 sentient programs running in parallel, is
 it better, if we aim to reduce suffering, to (a) terminate 9 of the 10
 programs and leave one still running and experiencing E, or (b) stop 5 of
 the 10 programs from experiencing E, but leave them running,

Ah ha! And having what sort of experience?  It's crucial: I shall
assume that they are having very mildly positive experiences, e.g.,
that their lives are barely worth living in this condition.


Yes, this is what I meant: that given a 49% chance of escaping E and living, 
or a 50% chance of escaping E by dying, dying would be the preferred option.



 and leave the other 5 programs continuing to experience E?

I will say that it is better to terminate 9 of the 10,
because we are given that the experience E is horrible



 But I would argue that if you are one of the suffering victims,
 (a) does you no good at all: subjectively, you will continue to
 suffer, since the one remaining program that is running will
 serve as continuation for any of the 9 terminated ones.

Here is the dreadful "closest continuer" method of Nozick and
others. I claim it gives the wrong answer. Look, the continuation
happens anyway, whether you die here or not!  Especially if the
events are outside each other's light cones, how can what happens
here possibly affect what happens there?  Just because you, say,
are *not* terminated here does not mean that you don't continue
there just as much.


It is the "closest" part of Nozick's method that I disagree with, based as 
it is on the assumption that there can only be one real you - the 
"closest continuer" - out of multiple possible candidates. I believe that even 
though someone can only *experience* being one person at a time, in the 
event of duplication all the copies have an equal claim to being 
continuations of the original, and it is in attempting to reconcile these 
two facts that I arrive at the notion of subjective probabilities for the 
next observer moment.


As for everything else you say in the above paragraph, I agree completely! I 
had been assuming that the experiment was being done in an isolated system, 
but if you take into account the rest of the multiverse, there is no reason 
why the continuer (or successor OM) at the point where the (a)/(b) decision 
is being made must come from the experiment rather than, say, 10^10^100 
metres away. (Some might argue that there has to be some sort of information 
transfer if the successor OM is to count, but I can't see why this should be 
so.)


The number of successor OM's and the type of experiences they are having are 
important, and they change the calculations used to determine the subjective 
probabilities. Let's assume that the external successor OM's all have a 
bland, average sort of experience, similar to the alternative to E in choice 
(b). If there is one other successor OM, then choice (b) would still be 
better:


(a) Pr(E)=1/2, (b) Pr(E)=5/11

If there were two external successor OM's, choice (a) would now be better:

(a) Pr(E)=1/3, (b) Pr(E)=5/12

If there were more than two external successor OM's, (a) would be an even 
better choice.
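
For anyone who wants to reproduce these figures, here is a small Python
sketch of the successor-OM counting, assuming (as the paragraph above does)
that every successor OM is weighted equally; the function name and layout
are mine, not part of the original post.

    from fractions import Fraction

    def prob_E(copies_in_E, copies_spared, external):
        """Subjective probability that the next observer moment experiences E."""
        total = copies_in_E + copies_spared + external
        return Fraction(copies_in_E, total)

    for external in (1, 2, 3):
        pa = prob_E(copies_in_E=1, copies_spared=0, external=external)  # choice (a)
        pb = prob_E(copies_in_E=5, copies_spared=5, external=external)  # choice (b)
        print(external, pa, pb)
    # external=1: (a) 1/2, (b) 5/11 -> (b) is better
    # external=2: (a) 1/3, (b) 5/12 -> (a) is better
    # external=3: (a) 1/4, (b) 5/13 -> (a) is better still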


Given the existence of the rest of the multiverse, where all sorts of things 
impossible for us here to know, let alone control, may be happening or (more 
importantly) may in future happen to other versions of us, a general 
argument can be made that it *is* helpful to increase our measure as much as 
possible, as a sort of counterbalance to any terrible things that may be 
happening to us elsewhere. It is the relative measure, which determines the 
subjective probability of what our next moment will be like, that is 
important, rather than absolute measure per se. In most cases it may turn out 
the same whether you look at it your way or my way, but in certain special 
cases, such as the thought experiment above considered as an isolated 
system, there are differences.





 In fact, there is no way for someone inside the simulated system
 to know that any of the instantiations had been terminated, as
 long as at least one keeps running.

Well, even if they are ALL terminated, the subject does not know.
Knowing is an activity, and so you don't know if suddenly you all
die.

 On the other hand, with (b) there is subjectively a 50% probability that
 your suffering will end.

People who use probability when discussing duplicates seem to talk
as though an executing process had a magical serial number generated
by God. When the subject dies, it's as if the serial number is
instantaneously transferred to *one* other of the possible systems
that could support him, but not to the others. But there are no souls.
There are no serial numbers. You become all the others equally, and
with 100% probability for each. You even become them if you do not die.

RE: More is Better (was RE: another puzzle)

2005-06-28 Thread Lee Corbin
Stathis writes

 [Lee wrote]
  Here is the dreadful "closest continuer" method of Nozick and
  others. I claim it gives the wrong answer. Look, the continuation
  happens anyway, whether you die here or not!  Especially if the
  events are outside each other's light cones, how can what happens
  here possibly affect what happens there?  Just because you, say,
  are *not* terminated here does not mean that you don't continue
  there just as much.
 
 It is the "closest" part of Nozick's method
 that I disagree with, based as it is on the
 assumption that there can only be one
 real you - the "closest continuer" - out of
 multiple possible candidates.

We agree on that!

 I believe that even though someone can only
 *experience* being one person at a time, in
 the event of duplication all the copies have
 an equal claim to being continuations of the
 original, and it is in attempting to reconcile
 these two facts that I arrive at the notion of
 subjective probabilities for the next observer
 moment.

It's with the very notion of a continuer that
I've always had a problem. So let me ask you
a little about it. Now clearly, if ten minutes
from now the Earth Stathis is to be killed, but
a Martian duplicate is made five minutes from
now, you---the present Stathis---don't really
have a problem with that (that is, not a problem
that couldn't be fixed with a big bribe).

So next, let me ask about a new situation in 
which one hour from now you are to die here,
but a duplicate of you will the same second
be established on Mars, only this duplicate
has a little amnesia, and doesn't remember
the last half-hour of its life. Would that
be a continuer of you?  Would it be a 
continuer of you+60_minutes from now?
Will it only be a continuer of you+30_minutes
from now? Is it a continuer of the you-now?

How about this? For ten million dollars, would
you agree to have the last ten minutes of your
memory erased, where you are now?

 Given the existence of the rest of the multiverse, where all sorts of things 
 impossible for us here to know, let alone control, may be happening or (more 
 importantly) may in future happen to other versions of us, a general 
 argument can be made that it *is* helpful to increase our measure as much as 
 possible, as a sort of counterbalance to any terrible things that may be 
 happening to us elsewhere.

Yes, I agree.  But bringing in the multiverse may be a sort
of red herring here, if you'll permit me to focus more on
just one universe for the sake of simplicity.

Lee



Re: More is Better (was RE: another puzzle)

2005-06-27 Thread Bruno Marchal


Lee Corbin wrote



Again, I think that the first-person point of view can lead to
errors just as incorrect as those of Ptolemaic astronomy.




I disagree because, almost by definition, the first-person point of 
view is incorrigible.
The error you point to is more subtle: it consists in communicating a 
first-person truth *as if* it were objective. In that case we are led to 
Ptolemaic-like errors.


Now, if comp is assumed, we can justify that the belief in a 
substantial physical universe, and/or the belief that physics is the 
fundamental science, are just such Ptolemaic-like mistakes, though made at 
a much deeper level of explanation.


Bruno


http://iridia.ulb.ac.be/~marchal/



RE: More is Better (was RE: another puzzle)

2005-06-27 Thread Stathis Papaioannou

Lee Corbin writes (replying to Jesse Mazer):


 Obviously my sadness is not because the death of the copy here
 means that there are only 10^10^29 - 1 copies of that person...

By the way, this figure 10^10^29 is a *distance*. It is, according
to Tegmark, very approximately how close in terms of meters the
nearest exact copy of you who is reading this is.  (And it doesn't
matter whether one uses meters or lightyears.)


A strange fact: when you get up to numbers that big, a light year is as many 
times bigger than a metre as it always is, but the double exponential 
notation makes the difference between the two look negligible. Can anyone 
say how widely accepted Tegmark's infinite universe model is amongst 
cosmologists?
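
To make the point concrete: a light year is about 9.46 x 10^15 metres, so
converting 10^10^29 metres into light years shifts the inner exponent by
only about 16, which is invisible next to 10^29. A minimal Python sketch of
that bookkeeping follows (the light-year constant is the standard
approximate value; nothing else below comes from the posts).

    import math

    METRES_PER_LIGHT_YEAR = 9.4607e15       # approximate standard value

    inner_exponent_metres = 10.0**29        # distance = 10**(10**29) metres
    shift = math.log10(METRES_PER_LIGHT_YEAR)   # ~15.98

    inner_exponent_ly = inner_exponent_metres - shift
    print(shift)              # ~15.98
    print(inner_exponent_ly)  # prints 1e+29: the shift vanishes at this scale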



Let me resort to another torture experiment. Suppose that I invite
you into my house, take you down to the torture chamber, and allow
you to look through a tiny peephole inside the entire steel-encased
chamber. You see some Nazis torturing a little girl, and her screams
are reproduced electronically so that you hear them.

You are appalled. You beg me to dissolve the chamber and put an end
to the atrocity. But then I say the following peculiar thing to you:
"Ah, but you see, this is an *exact* molecular---down to the QM
details---reenactment of an incident that happened in 1945. So you
see, since it's identical, it doesn't matter whether the little girl
suffers once or twice."


This brings up an interesting conundrum that I raised three or four torture 
experiments ago. Given 10 instantiations of a person having an unpleasant 
experience E (in the terminology of mild-mannered Hal Finney, who eschews 
torture even in thought experiments), for example 10 sentient programs 
running in parallel, is it better, if we aim to reduce suffering, to (a) 
terminate 9 of the 10 programs and leave one still running and experiencing 
E, or (b) stop 5 of the 10 programs from experiencing E, but leave them 
running, and leave the other 5 programs continuing to experience E?
(Assume for the sake of argument that, as with dogs, painless termination is 
better than continuing to live with pain. Why we automatically assume this 
for dogs but not for humans is another question.)


If you do the total suffering equation assuming that each instantiation is 
separate, (a) is better. But I would argue that if you are one of the 
suffering victims, (a) does you no good at all: subjectively, you will 
continue to suffer, since the one remaining program that is running will 
serve as continuation for any of the 9 terminated ones. In fact, there is no 
way for someone inside the simulated system to know that any of the 
instantiations had been terminated, as long as at least one keeps running. 
On the other hand, with (b) there is subjectively a 50% probability that 
your suffering will end. If you simply added up the total number of 
instantiations and attempted to minimise the number experiencing E, you 
would be doing the victim(s) a great disservice. Whether you say there is 
one victim or 10 to begin with is a moot point, but the conclusion still 
stands.


--Stathis Papaioannou





RE: another puzzle

2005-06-26 Thread Lee Corbin
Here is yet another delightful Stathis experiment that I fished up from
about ten days ago:

Hal wrote
 Stathis Papaioannou writes:
  You find yourself in a locked room with no windows, and no memory of how you
  got there. The room is sparsely furnished: a chair, a desk, pen and paper,
  and in one corner a light. The light is currently red, but in the time you
  have been in the room you have observed that it alternates between red and
  green every 10 minutes. Other than the coloured light, nothing in the room
  seems to change. Opening one of the desk drawers, you find a piece of paper
  with incredibly neat handwriting. It turns out to be a letter from God,
  revealing that you have been placed in the room as part of a philosophical
  experiment. Every 10 minutes, the system alternates between two states. One
  state consists of you alone in your room. The other state consists of 10^100
  exact copies of you, their minds perfectly synchronised with your mind, each
  copy isolated from all the others in a room just like yours.

I'm liking this already.  So while the light is, say, green, then my
experiences are being multiplied---from the objective, scientific point
of view, i.e., the truthful view---by an astounding factor of 10^100!

  Whenever the light changes colour, it means that God is either instantaneously
  creating (10^100 - 1) copies, or instantaneously destroying all but one
  randomly chosen copy.

  Your task is to guess which colour of the light corresponds with which state
  and write it down. Then God will send you home.

Of course, no instance will be able to say which is which. Thus I, taken as a
sort of collection of all my instances, cannot know and cannot say.

Hal says:

 Let me make a few comments about this experiment.  I would find it quite
 alarming to be experiencing these conditions.

What??  Alarming?

 When the light changes and I go from the high to the low measure state,
 I would expect to die. When it goes from the low to the high measure state,
 I would expect that my next moment is in a brand new consciousness (that
 shares memories with the old).  [Brand new consciousness? What is that?
 How is it different from the old one?]

First, Stathis has cleverly inserted that you don't know which is
which, so even the highly-anxious are spared. As for me, suppose
even that I knew. Then when the light went green, my joy would be
great (and in one hell of a lot of places!).  When it turned red,
I would be very philosophical about it and say, "Aw, the best part
is off now, darn it. But I'm certainly in no worse shape than when
I came into this wonderful room."

 Although the near-certainty of death is balanced by the
 near-certainty of birth, it is to such an extreme degree that it seems
 utterly bizarre.  Conscious observers should not be created and destroyed
 so cavalierly, not if they know about it.

Why not, exactly?  I'm sure that you have considered (perhaps later
in the thread) the possibility that this is already happening to you
millions of times per second. So what?  Nothing to get excited enough
to write home about.

 Suppose you stepped out of a duplicating booth, and a guy walked up with
 a gun, aimed it at you, pulled the trigger and killed you.  Would you
 say, "Oh well, I'm only losing two seconds of memories, my counterpart
 will go on anyway"?

Yes.

 I don't think so, I think you would be extremely alarmed and upset at
 the prospect of your death.

Nah, not if I had been fully informed of the situation beforehand.
After all, I can even throw myself out of an airplane at high altitude
(with a parachute) even though it makes me extremely agitated. But 
since I *know* that I will be okay, it's not really a problem.

 God is basically putting you in this situation, but to an enormously,
 unimaginably vaster degree.  He is literally playing God with your
 consciousness.  I would say it's a very bad thing to do.

Okay, I agree it's unethical for Him to do it to some people. But He's
*not* really hurting them now, is he?  At no point are they any worse off.

 Well, so what?  What good is that?  Why do I care, given that I am
 going to die, what happens to the one in 10^100 part of me?  That's an
 inconceivably small fraction.
 
 In fact, I might actually prefer to have that tiny fraction stay in the
 room so I can be reborn.  Having 10^100 copies 50% of the time gives me
 a lot higher measure than just being one person.  I know I just finished
 complaining about the ethical problems of putting a conscious entity in
 this situation, but maybe there are reasons to think it's good.

Yes!  Now that's right!

 So I don't necessarily see that I am motivated to follow God's
 instructions and try to guess.  I might just want to sit there.
 And in any case, the reward from guessing right seems pretty slim
 and unmotivating.  Congratulations, you get to die.  Whoopie.

Well, would you be willing to pay much for it *to* happen, or
*not to* happen?

RE: another puzzle

2005-06-26 Thread Jesse Mazer

Lee Corbin wrote:


If I, on the other hand, knew that this wonderful room was going to
be available to me on a specific date, I would collect all my favorite
movies, my best books, some certain chemicals that it is best not to
describe in detail, and would look forward to the most wonderful
afternoon of my life.  I would enthusiastically pay a good fraction
of my net worth for this opportunity.

Why?  Why would I do it?  Because logic grabs me by the throat and
*forces* me to   :-)


What is the logic here exactly, though? From a third person point of view, 
why is it objectively better to have a lot of copies of you having 
identical good experiences than it is to have only a single copy have the 
same good experience? After all, if they lied to you and never made any 
copies at all, no version of you would ever know the difference.


Also, wouldn't the same logic tell you that if we lived in a utopian society 
where pretty much everyone was happy, it would be immoral to use birth 
control because we want to make the number of people having happy 
experiences as high as possible? That doesn't seem like a position that 
anyone who rejects first-person thinking would automatically accept.


Jesse




RE: another puzzle

2005-06-26 Thread Lee Corbin
Jesse writes

 Lee Corbin wrote:
 
  If I, on the other hand, knew that this wonderful room was going to
  be available to me on a specific date,... I would enthusiastically
  pay a good fraction of my net worth for this opportunity.
 
 Why?  Why would I do it?  Because logic grabs me by the throat and
 *forces* me to   :-)
 
 What is the logic here exactly, though? From a third person point of view, 
 why is it objectively better to have a lot of copies of you having 
 identical good experiences than it is to have only a single copy have the 
 same good experience?

First, I think that it's important to remove the qualifier "identical"
here. Would two copies cease to be identical if one atom were out of
place?  Hardly.  On another tack, you are the same person, etc., that
you were five minutes ago, where strict identicalness isn't even close.

Second, suppose that someone loves you, and wants the best for you.
There are a number of ways to describe this, but the infinite Level
One universe is a good one. The person who loves you (and so has such
a true 3rd person point of view) sees you die here on Earth, and is
sad for you. Yet she understands totally that you are still alive in
many, many places 10^10^29 meters from here. Her most logical retort is that
you should be alive *here* too; that an extra Jesse here is simply
good for Jesse, no matter what is going on far away.

If she finds out that although dead on Earth, you've been copied into
a body out near Pluto, (and have the same quality of life there), she's
once again happy for you.

 After all, if they lied to you and never made any copies at all,
 no version of you would ever know the difference.

Well, lots of things can go better or worse for me without me
being informed of the difference. Someone might perpetrate a
scam on me, for example, that cheated me of some money I'd
otherwise get, and it is still bad for me even if I don't know
about it.

So this is why I subscribe to the descriptions involving measure;
namely, it's better when my measure goes up, and worse when it
doesn't.

 Also, wouldn't the same logic tell you that if we lived in a utopian society 
 where pretty much everyone was happy, it would be immoral to use birth 
 control because we want to make the number of people having happy 
 experiences as high as possible?

Yes, exactly.  Each time we can rescue someone from non-existence,
we should (given that other things are equal).

 That doesn't seem like a position that anyone who rejects first-person
 thinking would automatically accept.

They may not. But the two great moral revolutions/revelations of my 
life--- (i) cryonics and (ii) the Hedonistic Imperative (www.hedweb.com)
--- lay down that life is better than death, and pleasure is better
than pain.  And that we need not be shy about uniformizing and
extending these concepts as far as we can.

Lee



RE: another puzzle

2005-06-26 Thread Jesse Mazer

Lee Corbin wrote:


Jesse writes

 Lee Corbin wrote:

  If I, on the other hand, knew that this wonderful room was going to
  be available to me on a specific date,... I would enthusiastically
  pay a good fraction of my net worth for this opportunity.
 
 Why?  Why would I do it?  Because logic grabs me by the throat and
 *forces* me to   :-)

 What is the logic here exactly, though? From a third person point of view,
 why is it objectively better to have a lot of copies of you having
 identical good experiences than it is to have only a single copy have the
 same good experience?

First, I think that it's important to remove the qualifier identical
here. Would two copies cease to be identical if one atom were out of
place?


I meant something more like running the same program--I was thinking in 
terms of minds running on computers, since that's the only way to ensure all 
the copies run in lockstep. If you're talking about ordinary physical 
copies, the butterfly effect probably ensures their behavior and experiences 
will diverge in a noticeable macroscopic way fairly quickly.
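
To make that contrast concrete, here is a small toy sketch in Python (my 
own illustration; the logistic map is just a stand-in for chaotic physical 
dynamics, nothing about real brains is being modelled): two runs of the 
same deterministic program stay in lockstep forever, while two "physical" 
states differing by one part in a trillion become macroscopically 
different within a few dozen steps.

# Toy illustration: deterministic programs stay in lockstep, while
# chaotic "physical" dynamics amplify a tiny initial difference.

def logistic_step(x, r=4.0):
    # One step of the logistic map, a standard stand-in for chaos.
    return r * x * (1.0 - x)

# Two copies started in *exactly* the same state never diverge.
a = b = 0.123456789
for _ in range(100):
    a, b = logistic_step(a), logistic_step(b)
assert a == b   # identical program, identical history

# Two "physical" copies differing by one part in a trillion diverge fast.
c, d = 0.123456789, 0.123456789 + 1e-12
for step in range(100):
    c, d = logistic_step(c), logistic_step(d)
    if abs(c - d) > 0.1:
        print("macroscopically different after", step + 1, "steps")
        break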



Hardly.  On another tack, you are the same person, etc., that
you were five minutes ago where strict identicalness isn't even close.


From a third-person POV, why am I the same person? If you don't believe 
there's an objective truth about continuity of identity, isn't it just a 
sort of aesthetic call?




Second, suppose that someone loves you, and wants the best for you.
There are a number of ways to describe this, but the infinite level
one universe is a good one. The person who loves you (and so has such
a true 3rd person point of view) sees you die here on Earth, and is
sad for you. Yet she understands totally that you are still alive in
many, many places 10^10^29 from here. Her most logical retort is that
you should be alive *here* too; that an extra Jesse here is simply
good for Jesse, no matter what is going on far away.

If she finds out that although dead on Earth, you've been copied into
a body out near Pluto, (and have the same quality of life there), she's
once again happy for you.


That's a pretty unhuman kind of love though--if a person I know dies, I'm 
sad because I'll never get to interact with them again; the fact that 
versions of them may continue to exist in totally unreachable parallel 
universes isn't much comfort. Obviously my sadness is not because the death 
of the copy here means that there are only 10^10^29 - 1 copies of that 
person rather than 10^10^29 copies.


By the same token, the relief I'd feel knowing that there's a backup copy 
who's living on Pluto has to do with the fact that the potential for meeting 
and interacting now exists again, that all the information in my friend's 
brain hasn't been lost to me forever. But if I find out that there are *two* 
backup copies running in lockstep on Pluto, that doesn't make me any happier 
than I was before, in fact I wouldn't feel a twinge of sadness if one of the 
copies was deleted to save room on the Plutonian hard drive.




 After all, if they lied to you and never made any copies at all,
 no version of you would ever know the difference.

Well, lots of things can go better or worse for me without me
being informed of the difference. Someone might perpetrate a
scam on me, for example, that cheated me of some money I'd
otherwise get, and it is still bad for me even if I don't know
about it.


OK, in that case there are distinct potential experiences you might have had 
that you now won't get to have. But in the case of a large number of copies 
running in lockstep, there are no distinct experiences the copies will have 
that a single copy wouldn't have.




 Also, wouldn't the same logic tell you that if we lived in a utopian society
 where pretty much everyone was happy, it would be immoral to use birth
 control because we want to make the number of people having happy
 experiences as high as possible?

Yes, exactly.  Each time we can rescue someone from non-existence,
we should (given that other things are equal).


But if there's already at least one instantiation of a particular program, 
should we count additional instantiations as distinct someones? Aren't 
they just multiple copies of the same someone, so as long as there's at 
least one copy, that someone exists?




 That doesn't seem like a position that anyone who rejects first-person
 thinking would automatically accept.

They may not. But the two great moral revolutions/revelations of my
life--- (i) cryonics and (ii) the Hedonistic Imperative (www.hedweb.com)
--- lay down that life is better than death, and pleasure is better
than pain.


But neither of these necessarily justifies the idea that redundant copies of 
the same program are better than a single copy of it.


The real problem here is that when we talk about what's better from a 
first-person POV, we just mean what *I* would prefer to experience happening 
to me; but if you want to only think in terms of a 

More is Better (was RE: another puzzle)

2005-06-26 Thread Lee Corbin
Jesse writes

  First, I think that it's important to remove the qualifier identical
  here. Would two copies cease to be identical if one atom were out of
  place?
 
 I meant something more like running the same program

Okay, that's fine.

  On another tack, you are the same person, etc., that you were
  five minutes ago where strict identicalness isn't even close.
 
 From a third-person POV, why am I the same person? If you don't believe 
 there's an objective truth about continuity of identity, isn't it just a 
 sort of aesthetic call?

When we say that you are the same person you were a few
minutes ago, of course, we are starting from common usage
and going from there. Normal people value their lives, 
and don't want to die, say, next week. Even legally, people
are regarded as having an identity that doesn't change much
over time.

Objectively, (i.e. 3rd person), there really *is* a fuzzy set 
of states that ought to be regarded as Jesse Mazur. Any 
intelligent investigator (or even a program that we cannot
quite write yet) could examine each of the six billion people
in the world and give a Yes or No answer to whether this
is an instance of Jesse Mazur. Naturally in the case of duplicates
(running on computers or running on biological hardware doesn't
matter) it may be found that there is more than one Jesse running.

It's *not* aesthetic whether, say, George Bush is you or not. He's
definitely not! He doesn't have your memories, for the first thing.
It's simply objectively true that some programs---or some clumps
of biological matter---are Jesse Mazur and others are not. (Even
though the boundary will not be exact, but fuzzy.)
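
To give the flavor of such an investigator program, here is a toy
sketch in Python (purely illustrative; the memory-overlap score is a
made-up proxy for whatever test a real investigator would apply). The
membership score is fuzzy, but clear cases still come out as crisp
Yes or No answers.

# Toy sketch: a graded "is this an instance of person P?" test.
# The similarity score is fuzzy, yet clear cases get crisp answers.

def memory_overlap(candidate, reference):
    # Jaccard similarity between two sets of remembered items (a crude proxy).
    if not candidate and not reference:
        return 1.0
    return len(candidate & reference) / len(candidate | reference)

def is_instance_of(candidate, reference, threshold=0.9):
    # Crisp Yes/No answer obtained by thresholding the fuzzy score.
    return memory_overlap(candidate, reference) >= threshold

jesse = {"posts to the everything-list", "remembers this thread",
         "likes certain foods"}
duplicate = set(jesse)        # a freshly made copy
bush = {"likes certain foods", "was a governor", "lives in the White House"}

print(is_instance_of(duplicate, jesse))   # True  -> counts as an instance
print(is_instance_of(bush, jesse))        # False -> definitely not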

  Second, suppose that someone loves you, and wants the best for you.
  The person who loves you...
  If she finds out that although dead on Earth, you've been copied into
  a body out near Pluto, (and have the same quality of life there), she's
  once again happy for you.
 
 That's a pretty unhuman kind of love though--if a person I know dies, I'm 
 sad because I'll never get to interact with them again,

Then you don't know true love  :-)  (just kidding) because as
the great novelists have explained, truly loving someone involves
wanting what is best for *them*, not just that you'll get the
pleasure of their company. Hence the examples where one lover
dies to save the other.

 Obviously my sadness is not because the death of the copy here
 means that there are only 10^10^29 - 1 copies of that person...

By the way, this figure 10^10^29 is a *distance*. It is, according
to Tegmark, very approximately how far away, in meters, the
nearest exact copy of you who is reading this is.  (And it doesn't
matter whether one uses meters or lightyears.)
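
A quick back-of-the-envelope check of that last parenthetical, in
Python (assuming nothing beyond the standard meters-per-light-year
figure): converting the units merely subtracts about 16 from an inner
exponent of 10^29, a relative change of roughly one part in 10^28.

import math

METERS_PER_LIGHTYEAR = 9.4607e15             # standard conversion factor

inner_exponent = 10**29                      # distance is 10**inner_exponent meters
exponent_shift = math.log10(METERS_PER_LIGHTYEAR)    # about 16

# In light-years the distance is 10**(inner_exponent - exponent_shift),
# and that shift is utterly negligible next to 10**29.
relative_change = exponent_shift / inner_exponent

print("inner exponent drops by about", round(exponent_shift, 1))   # 16.0
print("relative change in the exponent:", relative_change)         # ~1.6e-28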

  Well, lots of things can go better or worse for me without me
  being informed of the difference. Someone might perpetrate a
  scam on me, for example, that cheated me of some money I'd
  otherwise get, and it is still bad for me even if I don't know
  about it.
 
 OK, in that case there are distinct potential experiences you might have had 
 that you now won't get to have. But in the case of a large number of copies 
 running in lockstep, there are no distinct experiences the copies will have 
 that a single copy wouldn't have.

I am speaking even of the case you bring up where the experiences
are *exactly* alike, although, as I say, for physical copies a
difference of a few atoms (or even many) doesn't matter much.

This, then, is the big question: how may I appeal to your intuition
in such a way that you come to agree that benefit is strictly additive?

Let me resort to another torture experiment. Suppose that I invite
you into my house, take you down to the torture chamber, and allow
you to look through a tiny peephole into the otherwise entirely steel-encased
chamber. You see some Nazis torturing a little girl, and her screams
are reproduced electronically so that you hear them.

You are appalled. You beg me to dissolve the chamber and put an end
to the atrocity. But then I say the following peculiar thing to you:
"Ah, but you see, this is an *exact* molecular---down to the QM
details---reenactment of an incident that happened in 1945. So you
see, since it's identical, it doesn't matter whether the little girl
suffers once or twice."

Now translate that scenario to programs, where a program
here is suffering exactly the same way that it's suffering on Mars.
Still feel that since one is taking place anyway, it doesn't matter
whether a second one is?

The love of a mother who understood all the facts would not mislead
her; she would make the correct decision: all other things being equal
(say that her daughter was to live happily in any case after 2007),
she would judge that it is better for her daughter to suffer only
one computation---here, say---than two (say, here and on Mars).
Each time that the girl's suffering is independently and causally
calculated is a terrible thing.

It is this last sentence that well sums up the entire objective

RE: More is Better (was RE: another puzzle)

2005-06-26 Thread Jesse Mazer

Lee Corbin wrote:



Jesse writes

  First, I think that it's important to remove the qualifier identical
  here. Would two copies cease to be identical if one atom were out of
  place?

 I meant something more like running the same program

Okay, that's fine.

  On another tack, you are the same person, etc., that you were
  five minutes ago where strict identicalness isn't even close.

 From a third-person POV, why am I the same person? If you don't believe
 there's an objective truth about continuity of identity, isn't it just a
 sort of aesthetic call?

When we say that you are the same person you were a few
minutes ago, of course, we are starting from common usage
and going from there. Normal people value their lives,
and don't want to die, say, next week. Even legally, people
are regarded as having an identity that doesn't change much
over time.

Objectively, (i.e. 3rd person), there really *is* a fuzzy set
of states that ought to be regarded as Jesse Mazur.


MazEr!

Any intelligent investigator (or even a program that we cannot
quite write yet) could examine each of the six billion people
in the world and give a Yes or No answer to whether this
is an instance of Jesse Mazur. Naturally in the case of duplicates
(running on computers or running on biological hardware doesn't
matter) it may be found that there is more than one Jesse running.

It's *not* aesthetic whether, say, George Bush is you or not. He's
definitely not! He doesn't have your memories, for the first thing.
It's simply objectively true that some programs---or some clumps
of biological matter---are Jesse Mazur and others are not. (Even
though the boundary will not be exact, but fuzzy.)


I disagree--George Bush certainly has a lot of sensory memories (say, what 
certain foods taste like) in common with me, and plenty of 
life-event-memories which vaguely resemble mine. And I think if you scanned 
the entire multiverse it would be possible to find a continuum of minds with 
memories and lives intermediate between me and George Bush. There's not 
going to be a rigorous, totally well-defined procedure you can use to 
distinguish minds which belong to the set Jesse-Mazer-kinda-guys from 
minds which don't belong.




  Second, suppose that someone loves you, and wants the best for you.
  The person who loves you...
  If she finds out that although dead on Earth, you've been copied into
  a body out near Pluto, (and have the same quality of life there), she's
  once again happy for you.

 That's a pretty unhuman kind of love though--if a person I know dies, I'm
 sad because I'll never get to interact with them again,

Then you don't know true love  :-)  (just kidding) because as
the great novelists have explained, truly loving someone involves
wanting what is best for *them*, not just that you'll get the
pleasure of their company. Hence the examples where one lover
dies to save the other.


Yeah, but that's because those guys only believed in a single universe! Do 
you think anyone would buy a story where someone sacrificed their unique, 
unbacked-up life to save copy #348 of 1000 copies running in perfect 
lockstep? Do *you* think this would be a good thing to do, even though it 
would mean the loss of unique information (all the person's memories, 
thoughts, wisdom etc.) from the universe in order to prevent a death that 
won't remove any unique information at all from the universe?


From a first-person POV, I do believe the concept of self-sacrificing 
unselfish love still makes sense in a multiverse; it's just that it would 
involve trying to maximize the other person's subjective probability of 
experiencing happiness in the future.



This, then, is the big question: how may I appeal to your intuition
in such a way that you come to agree that benefit is strictly additive?

Let me resort to another torture experiment. Suppose that I invite
you into my house, take you down to the torture chamber, and allow
you to look through a tiny peephole into the otherwise entirely steel-encased
chamber. You see some Nazis torturing a little girl, and her screams
are reproduced electronically so that you hear them.

You are appalled. You beg me to dissolve the chamber and put an end
to the atrocity. But then I say the following peculiar thing to you:
"Ah, but you see, this is an *exact* molecular---down to the QM
details---reenactment of an incident that happened in 1945. So you
see, since it's identical, it doesn't matter whether the little girl
suffers once or twice."


Well, of course *I* would want to dissolve the chamber, because I think that 
dissolving this chamber will decrease the subjective first-person 
probability of having that experience of being tortured by the Nazis. I'm 
just saying it's not clear what difference dissolving it would make from the 
POV of a zombie like yourself. ;)



Now translate that scenario to programs, where a program
here is suffering exactly the same way that it's suffering on Mars.
Still feel that 

RE: More is Better (was RE: another puzzle)

2005-06-26 Thread Lee Corbin
Jesse writes

  It's *not* aesthetic whether, say, George Bush is you or not. He's
  definitely not! He doesn't have your memories, for the first thing.
  It's simply objectively true that some programs---or some clumps
  of biological matter---are Jesse Mazur and others are not. (Even
  though the boundary will not be exact, but fuzzy.)
 
 I disagree--George Bush certainly has a lot of sensory memories (say,
 what certain foods taste like) in common with me, and plenty of 
 life-event-memories which vaguely resemble mine. And I think if
 you scanned the entire multiverse it would be possible to find a
 continuum of minds with memories and lives intermediate between
 me and George Bush.

Of course. And that's true of anything you care to name (outside
mathematics and perhaps some atomic physics, I suppose).

 There's not going to be a rigorous, totally well-defined procedure
 you can use to distinguish minds which belong to the set Jesse-
 Mazer-kinda-guys from minds which don't belong.

I never said that there was. The allure of mathematically
precise, absolutely decisive categories must be resisted
for most things. Don't throw out what is important: namely
that there are such things as *stars* which are different
from *planets*, even though (of course, like everything)
there is a continuum.

We have tests which today can pick out Jesse Mazer from all
other humans, six billion or so, that live on the planet.
Even before we knew about DNA, it was possible for it to be
determined on Earth in the year 1860 who was and who was not
Abraham Lincoln.

 Well, of course *I* would want to dissolve the chamber,
 [where an exact re-enactment was taking place]
 because I think that dissolving this chamber will decrease
 the subjective first-person probability of having that
 experience of being tortured by the Nazis.

Me too.

 I'm just saying it's not clear what difference dissolving
 it would make from the POV of a zombie like yourself. ;)

Your meaning is unclear. But you may wish to just elide all this.

  The love of a mother who understood all the facts would not mislead
  her; she would make the correct decision: all other things being equal
  (say that her daughter was to live happily in any case after 2007),
  she would judge that it is better for her daughter to suffer only
  one computation---here, say---than two (say, here and on Mars).
  Each time that the girl's suffering is independently and causally
  calculated is a terrible thing.
 
 I don't see why it's terrible, if you reject the notion of first-person 
 probabilities. You've really only given an appeal to emotion rather than
 an argument here, and I would say the emotions are mostly grounded in 
 first-person intuitions, even if you don't consciously think of them that 
 way.

It's true that all my beliefs are analytically-continued from
my intuitions. I think that everyone's are. I've tried to find
an entirely consistent objective description of my values. It
seems to me that I have an almost entirely consistent version
of the values that a lot of people share (but then, I'm biased).

It started like this: I know what it's like for me to have a
bad experience, and when I then look at the physics I understand
that there is a certain organism that is causally going through
a number of states, and that it results in a process I don't 
like.  Conveniently, my instincts also suggest that I shouldn't
like it when other people suffer too. Dropping the error-prone
first person account, I then generalize from what is intrinsically
bad about people suffering to a wider view that includes programs
and processes. It wasn't rocket science, and many others have done
so just as I have.

   but if you want to only think in terms of a universal 
   objective third-person POV, then you must define better
   in terms of some universal objective moral system, and
   there doesn't seem to be any objective way to
   decide questions like whether multiple copies of the
   same happy A.I. are better than single copies.
 
  You're right, and here's how I go about it. We must be able to decide
  (or have an AI decide) whether or not an entity (person or program)
  is being benefited by a particular execution.
 
 But that's ignoring the main issue, because you haven't addressed the more 
 primary question of *why* we should think that if a single execution of a 
 simulation benefits the entity being simulated, then multiple executions of 
 the same simulation benefit it even more.

I have given some reasons, namely, it's a smooth extension of
our values from the cases of how we'd feel on seeing repeated
suffering. But it's understandable and correct for you to be
asking for more.

Let's suppose that we want an evaluation function, that is, a
way of pronouncing judgment on the issues of the day, or of
philosophic choices, or of other things that ask us our values.
A main purpose of philosophy, in my opinion, is prescriptive:
It should tell us how to choose in various situations.
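
As a minimal sketch of the sort of evaluation function I have in mind
(only an illustration of the additivity claim, in Python, not a
serious proposal): each causally independent execution of a benefiting
computation adds its benefit to the total, and each independent
execution of a suffering one subtracts.

# Minimal sketch: benefit treated as strictly additive over causally
# independent executions of a person-program.

from dataclasses import dataclass

@dataclass
class Execution:
    # One causally independent run of a person-program.
    person: str
    benefit: float        # positive = good experience, negative = suffering

def evaluate(executions):
    # Total value: every independent run counts, duplicated content or not.
    return sum(e.benefit for e in executions)

# One wonderful afternoon run twice scores twice as much as run once...
print(evaluate([Execution("Lee", +10.0), Execution("Lee", +10.0)]))       # 20.0
# ...and a second, exact rerun of a torture counts as an additional harm.
print(evaluate([Execution("girl", -100.0), Execution("girl", -100.0)]))   # -200.0

On Jesse's view, as I understand it, identical lockstep executions
would instead be collapsed to one before summing, which is exactly
where we differ.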