The arguably more interesting version of Ellsberg's paradox has balls of three different colors in the urn: 30 reds, and 60 that are some combination of blue and yellow. A ball is drawn.
In situation A, you get to choose between betting on red and betting on yellow (you get $1 if you guess right and $0 otherwise). In situation B, you get to choose between betting on red+blue and betting on yellow+blue. (If you bet on red+blue, you get a dollar if the ball drawn is either red or blue.) Most people choose to bet on red in the first case and yellow+blue in the second. That's inconsistent with having a subjective probability on the balls (no matter what your attitude is to risk). Note that, unlike your discussion below, there's only one decision. There is no second decision.

-- Joe

From [EMAIL PROTECTED] Mon Aug 4 12:18:00 2003
Date: Mon, 4 Aug 2003 18:16:17 +0000 (UTC)
From: Konrad Scheffler <[EMAIL PROTECTED]>
To: Joseph Halpern <[EMAIL PROTECTED]>
cc: [EMAIL PROTECTED]
Subject: Re: [UAI] Allais' paradox

> There seems to be some dispute about how compelling the Allais Paradox
> is. Savage said that he violated it the first time the example was
> given, but decided he had made a mistake and revised his beliefs.
> Most economists I know have a similar reaction. (Although, as a
> personal matter, I find it somewhat more compelling.) However, I do not
> know of any economist who does not find Ellsberg's Paradox compelling.
> They agree that even after having the problem pointed out, they wouldn't
> change their beliefs. -- Joe

Hmm, I'm afraid Ellsberg's paradox does not convince me either - I reproduce the version I'm referring to below. At the risk of sounding naive (after all, many people have spent years analysing this, so who am I to just dash off a few lines, being unfamiliar with their analysis?), here is how I view it:

At the first decision, the choice is equal if we assume an equal prior on the unknown ball distribution. However, I'll tend to distrust the person running the game, assigning a prior which favours the hypothesis that he played safe and used a small proportion of reds. Thus I go for 50-50.
The game-master then turns round and offers me an alternative. At this point, I change my prior assumptions: obviously this game-master is more devious than I expected! My prior will now either be even or perhaps tend towards favouring black (I'm wondering what this guy has up his sleeve...), so again I go for 50-50. So I don't think this violates the principle of maximising expected utility.

Ellsberg's Paradox:
-------------------

The paradox arises from a series of games involving colored balls in urns. Let's say there are two urns, each of which contains a hundred balls, which are either red or black. One urn contains fifty red balls and fifty black balls. The proportion of red and black in the other urn is unknown. You can draw one ball from one of the urns, without looking, and if you draw a red ball you win a hundred dollars. Which urn will you choose?

There is no good reason to think that the chance of getting a red ball is any better in one urn than in the other, but Ellsberg found that people overwhelmingly chose the urn known to have fifty balls of each color. The person running the game would then say, "O.K., you think that urn is likelier to have a red ball; now I'm going to offer you a hundred dollars if you draw a black ball." If you turned to the fifty-fifty urn for the red ball, it would seem you had a hunch that the other urn contained more black balls, and therefore you should try to draw your black ball from it. But, overwhelmingly, people chose the fifty-fifty urn again. The Ellsberg paradox is that people so strongly prefer definite information over ambiguity that they make choices consistent neither with the laws of probability nor with themselves.

Konrad
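Joe's inconsistency claim in the three-colour version can be checked with a few lines of arithmetic. A minimal sketch (the function name is mine, not from the thread): with 30 red balls known and blue+yellow summing to 60, any subjective probability has p(red) = 1/3 and p(yellow) + p(blue) = 2/3. Preferring red in situation A requires p(yellow) < 1/3, while preferring yellow+blue in situation B reduces to p(yellow) > 1/3, so no single probability assignment supports both modal choices.

```python
def prefers_modal_choices(p_yellow):
    """Return True if a subjective probability with the given p(yellow)
    strictly prefers red in situation A AND yellow+blue in situation B."""
    p_red = 1 / 3                 # 30 of 90 balls are red, by construction
    p_blue = 2 / 3 - p_yellow     # blue and yellow together fill the other 60

    prefers_red_in_A = p_red > p_yellow                       # bet red vs. bet yellow
    prefers_yb_in_B = (p_yellow + p_blue) > (p_red + p_blue)  # yellow+blue vs. red+blue
    return prefers_red_in_A and prefers_yb_in_B

# Scan p(yellow) over its whole feasible range [0, 2/3]: no value
# satisfies both preferences, since A needs p(yellow) < 1/3 and B
# needs p(yellow) > 1/3.
assert not any(prefers_modal_choices(i / 10000 * (2 / 3)) for i in range(10001))
print("no subjective probability supports both modal choices")
```

The same one-line argument handles the two-urn version Konrad quotes: picking the fifty-fifty urn for red implies a belief that the unknown urn has p(red) < 1/2, hence p(black) > 1/2, so a consistent bettor should switch urns for the black bet.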
