On Tue, Jul 16, 2002 at 06:58:50PM -0700, Hal Finney wrote:
> I am confused about the relation of S to A and B.  Did S go into a
> copying machine and get two copies, A and B made, in addition to S?
> And now A and B are deciding what S will win?

Yes, and yes.

> Why should they care?  If S gets a TV that does not benefit them.  Is it
> just that they are similar to S, being recent copies of him, so they
> have a sort of brotherly fondness for him and would like to see him happy?


> I thought the earlier experiments had a more direct connection, where
> the people making the decisions were the same ones who were getting
> the reward (or at least, future copies of themselves)

The same reasoning applies to the earlier experiments as well. I thought
this version is simpler because it removes the issue of whether the value
of an experience depends on the experiences of one's copies.

BTW, evolution programmed us to value certain experiences, even if those
experiences no longer reliably indicate increases in inclusive fitness.
This is not going to last very long if we remain in an evolutionary
regime. Those who value only experiences that reliably indicate increases
in inclusive fitness will have an evolutionary advantage. That means the
same experience will have very different values depending on the subject's
background knowledge. For example, the experience of eating a delicious
meal would not be valued if it's known that the experience is a
computer-generated illusion and no actual nutrition is being gained. In my
previous thought experiments, the experiences that were considered rewards
did not indicate increases in inclusive fitness. That doesn't mean it's
irrational to value them, just that most people in the future probably
will not value them.

Similarly, if copying becomes possible, then people who care greatly about 
their copies will also have an evolutionary advantage.

> I'm not sure I understand this; since the payoff matrix is symmetric it
> doesn't matter if you are A or B so I don't see what the point is of
> introducing amnesia.  Would there be any cases with symmetric payoffs
> where an amnesiac would behave differently than someone who knew whether
> he was A or B?

(*) I think they *shouldn't* behave differently. But if you consider
yourself to be both A and B when you don't know whether you are A or B,
then you *would* behave differently and choose Cooperate instead of
Defect. That's why I think it's wrong to consider yourself to be both A
and B.
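To make that concrete, here's a quick sketch in Python. The payoff numbers
are just illustrative (a standard symmetric PD with T > R > P > S), but they
show why the two ways of thinking about your identity recommend different
moves:

```python
# Illustrative symmetric PD payoffs (T=5 > R=3 > P=1 > S=0).
# PAYOFF[(my_move, other_move)] = (my_payoff, other_payoff)
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def best_as_single_player(other_move):
    # Standard analysis: I am A (or B) only, the other's move is fixed,
    # so I just compare my own payoff for C vs D.
    return max("CD", key=lambda m: PAYOFF[(m, other_move)][0])

def best_as_both_copies():
    # "I am both A and B": one choice sets both moves, so the only
    # reachable outcomes are (C,C) and (D,D); compare their totals.
    return max("CD", key=lambda m: sum(PAYOFF[(m, m)]))

# Defect dominates for a single player, whatever the other does...
assert best_as_single_player("C") == "D"
assert best_as_single_player("D") == "D"
# ...but someone who considers himself to be both copies cooperates.
assert best_as_both_copies() == "C"
```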

> Tangentially, it seems that the PD is a case where the "evidential"
> vs "causal" principles of decision theory would show a difference.
> The evidentialist would argue that by cooperating, it increases the
> chance that the other person will cooperate (perhaps to a certainty, in
> some versions), hence cooperating can be justified.  This is basically
> Hofstadter's principle of super-rationality.  The causalist would reject
> the possible correlation of choices and choose the dominant strategy
> of defecting.  Does that seem correct?

There is some literature on this connection between PD and 
evidential vs causal decision theory. For example:

Lewis, D.: 1979, 'Prisoners' Dilemma is a Newcomb Problem'. In
Campbell and Sowden: 251-255. Originally in Philosophy and Public Affairs
8, 235-240.

Personally I think they are separate issues. Causal vs evidential is about
one-agent decision theory, and PD is about multi-agent decision theory
(i.e. game theory). It doesn't make sense to use one-agent decision theory
to analyze PD.
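Still, for what it's worth, the evidential/causal split you describe can be
sketched numerically. The payoffs and the correlation parameter below are
just assumptions for illustration; the evidentialist conditions on his own
choice as evidence about the other's choice, while the causalist holds the
other's move fixed:

```python
# Standard PD ordering: T > R > P > S (values are illustrative).
R, S, T, P = 3, 0, 5, 1

def evidential_eu(my_move, p_match=0.9):
    # Evidentialist: P(other plays my move | my move) = p_match,
    # an assumed correlation between the two choices.
    if my_move == "C":
        return p_match * R + (1 - p_match) * S
    return p_match * P + (1 - p_match) * T

def causal_eu(my_move, p_other_c=0.5):
    # Causalist: the other's move is causally independent of mine,
    # so the same probability applies whichever move I pick.
    if my_move == "C":
        return p_other_c * R + (1 - p_other_c) * S
    return p_other_c * T + (1 - p_other_c) * P

# With strong enough correlation the evidentialist cooperates...
assert evidential_eu("C") > evidential_eu("D")
# ...while for the causalist Defect dominates at any fixed probability.
assert causal_eu("D") > causal_eu("C")
```

(Note that since T > R and P > S, the causalist's Defect wins for every
value of p_other_c, which is just the dominance argument.)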

BTW, I'm now having doubts about causal decision theory. Perhaps the extra
generality isn't really needed. See Huw Price's "Agency and Probabilistic
Causality" (http://www.usyd.edu.au/philosophy/price/preprints/AgencyPC.pdf)
for an argument against causal decision theory. I will try to summarize my
own thoughts on this matter in another post.

> What about this: you are going to be copied, and your two copies are going
> to play a PD game.  You know this ahead of time and so you can decide
> on whatever strategies you intend to follow during this time before the
> copying occurs.  Do you think in that case it would be rational to firmly
> decide beforehand to cooperate?

I guess you're assuming that your copies are not going to care about each
other, but that it's possible to commit yourself to cooperating before
you're copied? In that case I think it would be rational to make this
commitment.

> And would "amnesia" make a difference?  We might arrange for amnesia by
> having the duplicates immediately play the game, without any knowledge
> of which they are; and remove the amnesia by simply telling them that
> one is A and one is B, before they play.  Again, with a symmetric game
> I don't see how the amnesia or its absence would be relevant.  Maybe I
> am misunderstanding that aspect.

See the paragraph marked (*) above.
