Stathis writes

> > How about this? For ten million dollars, would
> > you agree to have the last ten minutes of your
> > memory erased, where you are now?
>
> These are all interesting questions that have bothered me for a long
> time. I think the most useful suggestion I can make about how to decide
> whether other versions of a person are or aren't "continuers" or the
> "same person" is to avoid a direct answer at all and ask - as you have
> done - how much memory loss a person would tolerate before they felt
> they would not be the "same" person.
Yes, it's a key question. All we can assert for sure is that as memory
loss increases from 0% to 100%, it pays less and less to think of
yourself as the same person. The same answer obtains if we measure in
terms of how much recent memory could be lost: on this, I happen to feel
that I am about fifty percent the same person now that I was at 18.

> 10 minutes of memory loss for ten million dollars is an offer
> I would definitely take up; in fact, people pay
> to get drunk on a Friday night and suffer more memory loss than this.

Right.

> Before you ask, this raises another interesting question: would I
> agree for the same amount of money to be painlessly killed 10 minutes
> after being duplicated? Given that I believe my duplicate provides
> seamless continuity of consciousness from the point of duplication,
> this should be the same as losing 10 minutes of memory. However, I
> would probably balk at being "killed" if it were happening for the
> first time, and I might hesitate even if I knew that it had happened
> to me many times before.

Well, the biggest point of philosophy for me is that it be prescriptive.
Suppose that you have to figure all this out ahead of time---or would
you prefer just to go with a gut instinct when the time comes?

> Yet another variation: for 10 million dollars, would you agree to
> undergo a week of excruciating pain, and then have the memory of the
> week wiped?

No.

> What if you remember agreeing to this 100 times in the past; that is,
> you remember agreeing to it, then a moment later experiencing a slight
> discontinuity, and being given the ten million dollars (which let's
> say you gambled all away).

This is a horrific situation that I would put a stop to if I could. One
of my old thought experiments was that you start noticing a $1000
increase in your bank account every day. Then after a month or so, you
learn that every night you are being awakened and tortured for an hour,
and then the memory is erased.
My way of looking at it provides a clear answer: you are your
duplicates, and so just because you don't remember something doesn't
mean that it didn't happen (or isn't happening) to you.

> These are not trivial questions. The basic problem is that our minds
> have evolved in a world where there is no copying and no memory loss
> (memory loss may have occurred naturally, of course, but evolution's
> answer to it would have been to wipe out the affected individual and
> their genes), so there is a mismatch between reason and intuition.

Well, it's time to at least be verbally able to prescribe what one would
do. The flat, linear model suggests that more good runtime for me is
good, less is worse, and bad runtime is worst of all. I think that if it
is given that either you or your duplicate must die, then you should
willingly sacrifice yourself if it will enrich your duplicate. Either
way, I think you wake up the next morning very satisfied with the
outcome.

Lee