-------- Original Message --------
Subject: Re: Observation selection effects
Date: Sat, 04 Sep 2004 02:29:54 -0400
From: Danny Mayes <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
References: <[EMAIL PROTECTED]>


These problems remind me of the infamous Monty Hall problem that got 
Marilyn vos Savant into some controversy.  Someone wrote in and asked the 
following question:

You are on "Let's Make a Deal" and are chosen to select one of 3 
doors, one of which has a car behind it.  You randomly select door 1.  
Monty, knowing where the car is, opens door 2, revealing an empty room, 
and asks if you want to stay with door 1.  The question was: is there 
any benefit in switching from door 1 to door 3?  Common sense would 
suggest Monty simply eliminated one choice, leaving you a 50-50 chance 
either way.  Marilyn argued that by switching, the contestant actually 
increases his odds from 1/3 to 2/3, the difference coming from the 
added information that the car is not behind door 2.  This example is 
discussed in the book "Information: The New Language of Science" by 
Hans Christian von Baeyer, which I am trying to read, but only getting 
through in bits and pieces as usual due to my work schedule.  According to 
the book, vos Savant still gets mail arguing against her position on this 
matter.  It seems to me it would be very easy to resolve with a friend: 
let one person play Monty and keep a tally of your success 
in switching vs. not switching (though I haven't tried this -- my wife 
didn't find it intriguing enough, unfortunately).
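
Lacking a willing volunteer to play Monty, a short simulation can stand in 
for the tally.  This is just a sketch of the standard setup (the names 
`monty_hall_trial` and `win_rate` are mine), assuming Monty always opens an 
empty door that isn't your pick:

```python
import random

def monty_hall_trial(switch):
    """Play one round of the Monty Hall game; return True on a win."""
    doors = [0, 1, 2]
    car = random.choice(doors)    # where the car is hidden
    pick = random.choice(doors)   # contestant's initial choice
    # Monty opens a door that is neither the pick nor the car
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # move to the one remaining unopened door
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch, trials=100_000):
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials

if __name__ == "__main__":
    print("stay:  ", win_rate(False))   # hovers around 1/3
    print("switch:", win_rate(True))    # hovers around 2/3
```

Running it, the switching tally comes out near 2/3 and staying near 1/3, 
just as vos Savant argued.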

I think these games provide good examples of how our common sense often 
works against a deep understanding of what is really going on around 
us.  I also think they point to the fundamental importance of the role 
of information in understanding the way our world (or multiverse) 
works.



Jesse Mazer wrote:

> Norman Samish:
>
>> The "Flip-Flop" game described by Stathis Papaioannou strikes me as a
>> version of the old Two-Envelope Paradox.
>>
>> Assume an eccentric millionaire offers you your choice of either of two
>> sealed envelopes, A or B, both containing money.  One envelope contains
>> twice as much as the other.  After you choose an envelope you will 
>> have the
>> option of trading it for the other envelope.
>>
>> Suppose you pick envelope A.  You open it and see that it contains $100.
>> Now you have to decide if you will keep the $100, or will you trade 
>> it for
>> whatever is in envelope B?
>>
>> You might reason as follows: since one envelope has twice what the 
>> other one
>> has, envelope B either has 200 dollars or 50 dollars, with equal
>> probability.  If you switch, you stand to either win $100 or to lose 
>> $50.
>> Since you stand to win more than you stand to lose, you should switch.
>>
>> But just before you tell the eccentric millionaire that you would 
>> like to
>> switch, another thought might occur to you.  If you had picked 
>> envelope B,
>> you would have come to exactly the same conclusion.  So if the above
>> argument is valid, you should switch no matter which envelope you 
>> choose.
>>
>> Therefore the argument for always switching is NOT valid - but I am 
>> unable,
>> at the moment, to tell you why!
>>
>
> Basically, I think the resolution of this paradox is that it's 
> impossible to pick a number randomly from 0 to infinity in such a way 
> that every number is equally likely to come up. Such an infinite flat 
> probability distribution would lead to paradoxical conclusions--for 
> example, if you picked two positive integers randomly from a flat 
> probability distribution, and then looked at the first integer, then 
> there would be a 100% chance the second integer would be larger, since 
> there are only a finite number of integers smaller than or equal to 
> the first one and an infinite number that are larger.
>
> For any logically possible probability distribution the millionaire 
> uses, it will be true that depending on what amount of money you find 
> in the first envelope, there won't always be an equal chance of 
> finding double the amount or half the amount in the other envelope. 
> For example, if the millionaire simply picks a random amount from 0 to 
> one million to put in the first envelope, and then flips a coin to 
> decide whether to put half or double that in the other envelope, then 
> if the first envelope contains more than one million there is a 100% 
> chance the other envelope contains less than that.
>
> For a more detailed discussion of the two-envelope paradox, see this 
> page:
>
> http://jamaica.u.arizona.edu/~chalmers/papers/envelope.html
>
> I don't think the solution to this paradox has any relation to the 
> solution to the flip-flop game, though. In the case of the flip-flop 
> game, it may help to assume that the players are all robots, and that 
> each player can assume that whatever decision it makes about whether 
> to switch or not, there is a 100% chance that all the other players 
> will follow the same line of reasoning and come to an identical 
> decision. In this case, since the money is awarded to the minority 
> flip, it's clear that it's better to switch, since if everyone 
> switches more of them will win. This problem actually reminds me more 
> of Newcomb's paradox, described at http://slate.msn.com/?id=2061419 , 
> because it depends on whether you assume your choice is absolutely 
> independent of choices made by other minds or if you should act as 
> though the choice you make can "cause" another mind to make a certain 
> choice even if there is no actual interaction between you.
>
> Jesse
>
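
Jesse's millionaire example is also easy to check numerically.  Below is a 
rough sketch (the setup and names are my own, and I'm assuming you are 
handed one of the two envelopes at random): the first amount is uniform on 
0 to one million, a coin decides whether the other envelope holds half or 
double, and we track what switching does.

```python
import random

CAP = 1_000_000

def one_game():
    """Fill two envelopes and hand the player one of them at random."""
    a = random.uniform(0, CAP)                       # first envelope
    b = a / 2 if random.random() < 0.5 else a * 2    # half or double
    mine, other = random.sample([a, b], 2)           # you open one at random
    return mine, other

def simulate(trials=200_000):
    total_gain = 0.0
    big_total = 0    # games where your envelope holds more than CAP
    big_losses = 0   # of those, games where switching would lose
    for _ in range(trials):
        mine, other = one_game()
        total_gain += other - mine
        if mine > CAP:
            big_total += 1
            big_losses += other < mine
    return total_gain / trials, big_losses, big_total

if __name__ == "__main__":
    mean_gain, big_losses, big_total = simulate()
    print("average gain from switching:", mean_gain)
    print("envelopes over the cap where switching lost:",
          big_losses, "of", big_total)
```

The average gain from always switching comes out near zero, and whenever 
the opened envelope holds more than the cap, switching loses every single 
time -- exactly the asymmetry Jesse describes for any bounded distribution.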
