On Jul 4, 2005, at 8:11 AM, Lee Corbin wrote:

You think that person A ought (in the ethical sense) to have a strong  
desire for the future existence of person B - no less, in fact, than  
for the future existence of person A.  You imply this when you say  
the subject is selfish.  I see your point, that normally we have a  
strong desire for the future existence of -- the person who will wake  
up in our bed tomorrow.


Hmm?  You are still seeing that I'm making an *ethical* statement
here somehow?   Well, I suppose that highly selfish behavior could
conceivably be described as ethical in some sense, but it's sure
confusing.

A statement about what a person should or shouldn't do falls under the domain of ethics.  When you say

"definitely in the case of very close copies, to be
consistent one should to the greatest degree he can
extend the boundary to include close duplicates."

you're making a normative statement.  I was arguing that one's intuitions will likely pull the other way.  You may say that "your duplicate is you," but it is undeniable that there are two organisms present, and an organism normally acts in such a way as to prevent damage to its own body; as you say, these instincts are forged by evolution.  These instincts form the basis of our ethical intuitions.  Your wish for "consistency" would seem to be in opposition to how most people's instincts would lead them to behave.

What would the Lee who stands to receive $5 in my experiment say to the Lee who is observing in a remote room, pondering which choice to make?  "Please kill yourself so that I might live; after all, I'll have $5 more than you and so will be slightly better off.  But if you do decide to kill me instead, I won't mind so much, since $5 isn't really that much money."  Can we really imagine people saying these things without first carrying out some intense philosophical gymnastics?

I don't know; I think Stathis has a good point that this sort of duplication isn't really possible, so all the conclusions we're drawing from it might be suspect, and entities that are duplicable might have vastly different intuitions about what is moral and what is not.
