On 5/3/2010 11:08 AM, Jesse Mazer wrote:


On Sat, May 1, 2010 at 8:26 PM, Rex Allen <rexallen...@gmail.com> wrote:

> On Sat, May 1, 2010 at 7:37 PM, Brent Meeker <meeke...@dslextreme.com> wrote:
>> Sure we can, because part of the meaning of "random", the very thing that
>> lost us the information, includes each square having the same measure for
>> being one of the numbers. If, for example, we said let all the "1"s come
>> first - in which case we can't hit any "not-1"s, that would be inconsistent
>> with saying we didn't have any information.
>
> We have two things here. Random. And infinite. Three things, actually: my
> random aim, an infinite row of squares, and each square's randomly assigned
> number lying between 1 and 6.
>
> If, due to the nature of infinity, there are the same number of 1's and
> not-1's, then I'd expect the probability of hitting a 1 to be 50-50. But
> there are also the same number of 1's and even numbers. And the same number
> of evens and odds. And the same number of 1's and 2's. And the same number
> of 2's and not-2's. AND... I have the *random* aim of the dart that I'm
> throwing at the row. So it's not a question of saying which number is
> likely to be next in a sequence. Rather, the question is which number I am
> likely to hit on this infinite row of squares.
>
> SO, I think we have zero information on which to base our probability
> calculation, because of the counting issues introduced by the infinity
> combined with the lack of pattern. There is no usable information.

Mathematicians do apparently have a well-defined notion of the "frequency" of different possible finite sequences (including one-digit sequences) in an infinite digit sequence.
For example, see the article at http://www.lbl.gov/Science-Articles/Archive/pi-random.html which talks about attempts by mathematicians to prove that the digit sequence of pi has a property called "normality", which means that any n-digit sequence should appear with the same frequency as every other n-digit sequence (so in base 2, it would imply that the 2-digit sequences 00, 01, 10 and 11 all appear equally frequently in the infinite sequence):

'Describing the normality property, Bailey explains that "in the familiar base 10 decimal number system, any single digit of a normal number occurs one tenth of the time, any two-digit combination occurs one one-hundredth of the time, and so on. It's like throwing a fair, ten-sided die forever and counting how often each side or combination of sides appears."'

'Pi certainly seems to behave this way. In the first six billion decimal places of pi, each of the digits from 0 through 9 shows up about six hundred million times. Yet such results, conceivably accidental, do not prove normality even in base 10, much less normality in other number bases.'

'In fact, not a single naturally occurring math constant has been proved normal in even one number base, to the chagrin of mathematicians. While many constants are believed to be normal -- including pi, the square root of 2, and the natural logarithm of 2, often written "log(2)" -- there are no proofs.'

So while it hasn't been proved, it sounds like it's at least a well-defined notion (and the article discusses some approaches to proving it which show some promise). Perhaps it means that if you look at the frequencies of different n-digit sequences in the first N digits of a number, the frequencies all approach equality in the limit as N goes to infinity.
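The single-digit statistics the article describes are easy to check empirically for a modest prefix of pi. A minimal sketch (the digit generator is Gibbons' well-known unbounded spigot algorithm; the tolerance is an illustrative choice, not a statistical test of normality):

```python
def pi_digits(n):
    """First n decimal digits of pi via Gibbons' unbounded spigot algorithm."""
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    out = []
    while len(out) < n:
        if 4 * q + r - t < m * t:
            # The next digit is determined; emit it and rescale.
            out.append(m)
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            # Consume another term of the underlying series.
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return out

digits = pi_digits(1000)
counts = {d: digits.count(d) for d in range(10)}
print(counts)  # each digit shows up roughly 100 times in the first 1000
```

As the article notes, such counts are suggestive but prove nothing about normality.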
It would presumably be possible to find infinite sequences that *aren't* "normal" in this sense, like .011011011011...

(Meanwhile, note that the naive idea of just picking a digit randomly from the entire infinite sequence, with all digits equally likely, doesn't actually make sense, because you can't have a uniform probability distribution on an infinite series of numbers. It would lead to paradoxes along the lines of the two-envelope paradox discussed at http://consc.net/papers/envelope.html except in this variant you'd be given one of two envelopes which you find to contain N dollars, where N was chosen at random from the infinite series of natural numbers 1,2,3,... using a uniform probability distribution so each natural number was equally likely. Then, if you have a choice to exchange it for another sealed envelope chosen in the same way, you should always bet that the second envelope contains more money with probability 1, since there are an infinite number of possible Ns larger than the one you got and only a finite number of Ns smaller. The paradox is that this argument would seem to work even before you have opened the first envelope and seen the specific value of N inside, so you're saying that there's a probability 1 that one of two identical featureless sealed envelopes has more money in it than the other!)
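The non-normal example .011011011... can be checked directly: it uses both binary digits infinitely often, yet the digit "0" has limiting frequency 1/3 rather than the 1/2 that base-2 normality requires. A short sketch:

```python
# Long prefix of the repeating binary expansion .011011011...
seq = "011" * 10000

freq0 = seq.count("0") / len(seq)  # limiting frequency of "0" is 1/3
freq1 = seq.count("1") / len(seq)  # limiting frequency of "1" is 2/3
print(freq0, freq1)
```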

There's a solution to the two-envelope paradox using a distribution over the infinite range of possible values, so it applies to the above form also. You start by assuming arbitrary distribution functions and then show that if the density function is assumed to be uniform on (0,inf) the paradox goes away. This is realistic, since in any actual realization you would have some idea of the upper bound on the distribution, and if you opened an envelope with this amount you wouldn't swap - the paradoxical symmetry depends on the assumption of an unbounded range.

Here's the solution. It is for a generalized form of the two-envelope puzzle in which the larger amount is r times as big as the smaller amount. In the end the solution is independent of the value of r, so it also applies to the form you cite above.

Without loss of generality, we can describe our prior density functions for the amounts in the two envelopes in terms of a density function fo(x), the ratio r of the larger amount to the smaller, and a scale factor k. Let L be the event that the envelope with the larger amount is picked and S the event that the envelope with the smaller amount is picked. Then our prior density functions for the amount m in the envelope are:

For the smaller amount:  f(m|S k) = k fo(km)
For the larger amount:   f(m|L k) = (k/r) fo(km/r)

Our uncertainty about the scale factor, k, is described by a density g(k). So, writing INT for the integral from zero to infinity,

   f(m|S) = INT k fo(km) g(k) dk
   f(m|L) = INT (k/r) fo(km/r) g(k) dk

Now in the first equation make a change of variable in the integral by y = km:

   f(m|S) = INT (y/m) fo(y) g(y/m) dy/m = (1/m^2) INT y fo(y) g(y/m) dy

and in the second change the variable of integration by x = km/r:

   f(m|L) = INT (x/m) fo(x) g(rx/m) (r/m) dx = (r/m^2) INT x fo(x) g(rx/m) dx

Now if we assume no prior knowledge of the scale of the amounts, we will take g(k) to be a flat (improper) density and the two integrals will be equal; whence

   f(m|L)/f(m|S) = r

But, by Bayes,

   f(m|L) = P(L|m) f(m)/P(L)

so

   P(L|m) = f(m|L) P(L)/f(m) = f(m|L) P(L)/[f(m|L) P(L) + f(m|S) P(S)]

Using P(L) = P(S), i.e. equal prior probability of selecting the larger or the smaller,

   P(L|m) = f(m|L)/[f(m|L) + f(m|S)]

Then dividing numerator and denominator by f(m|S),

   P(L|m) = r/[r + 1]   and   P(S|m) = 1 - P(L|m) = 1/[r + 1]

So the expected value of switching is

   <switch> = P(L|m) m/r + P(S|m) r m = [r/(r+1)] m/r + [1/(r+1)] r m = m

which is the same as not switching and keeping the amount m found in the first envelope; so there is no paradox. Note that if (as would be the case in a real instance) we do suppose we know something about the scale of the amounts, i.e. our prior g(k) is not actually flat, then we will expect a gain from switching if we see an amount m that is toward the low end of our prior, and we will not expect a gain if the amount we see is high. We do not have the paradox of wanting to switch even before we see the amount in the first envelope selected.
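The derivation above can be sanity-checked numerically. A sketch under stated assumptions: fo(x) = exp(-x) is an arbitrary choice of shape for the prior, and the flat improper g(k) is approximated by a flat density on (0, K) with K large; none of this is from the original post.

```python
import math

def integrate(f, a, b, n=200000):
    """Midpoint-rule numerical integration of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

r, m, K = 3.0, 2.0, 100.0          # ratio, observed amount, cutoff approximating infinity
fo = lambda x: math.exp(-x)        # assumed shape of the prior density

# f(m|S) = INT k fo(km) g(k) dk and f(m|L) = INT (k/r) fo(km/r) g(k) dk,
# with g flat on (0, K).
f_mS = integrate(lambda k: k * fo(k * m), 0.0, K)
f_mL = integrate(lambda k: (k / r) * fo(k * m / r), 0.0, K)

print(f_mL / f_mS)                 # approaches r, matching f(m|L)/f(m|S) = r

P_L = f_mL / (f_mL + f_mS)         # approaches r/(r+1)
expected_switch = P_L * m / r + (1 - P_L) * r * m
print(expected_switch)             # approaches m: no expected gain from switching
```

With a finite K the flat prior is proper, and the ratio only approximates r for amounts m well inside the assumed scale, which is exactly the point about bounded ranges made above.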

If this notion of considering the frequency of different finite sequences in an infinite sequence is a well-defined one, perhaps something similar could also be applied to an infinite spacetime and the frequency of Boltzmann brains vs. ordinary observers, although the mathematical definition would presumably be more tricky. You could consider finite-sized chunks of spacetime, or finite-sized spin networks or something in quantum gravity, and then look at the relative frequency of all the ones of a given "size" large enough to contain macroscopic observers. Suppose you knew the frequency F1 of "chunks" that appeared to be part of the early history of a baby universe, with entropy proceeding from lower on one end to higher on the other end, vs. the frequency F2 of "chunks" that seem to be part of a de Sitter space that had high entropy on both ends. Then if you could also estimate the average number N1 of ordinary observers that would be found in a chunk of the first type, and the average number N2 of Boltzmann brains that would be found spontaneously arising in a chunk of the second type, then if F1*N1 was much greater than F2*N2 you'd have a justification for saying that a typical observer is much more likely to be an ordinary one than a Boltzmann brain.

Jesse
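The F1*N1 vs. F2*N2 comparison is just weighted counting. A purely illustrative sketch: every number below is an invented placeholder, not a physical estimate.

```python
# Hypothetical inputs (invented for illustration only):
F1, N1 = 1e-30, 1e10   # frequency of "baby universe" chunks; ordinary observers per chunk
F2, N2 = 1e-10, 1e-25  # frequency of de Sitter chunks; Boltzmann brains per chunk

# Probability that a randomly chosen observer is ordinary, under this weighting.
p_ordinary = (F1 * N1) / (F1 * N1 + F2 * N2)
print(p_ordinary)  # close to 1 for these placeholder numbers
```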

Right. You hope to find some relative frequency of generation based on the physics, and that gives you a probability measure. You can't use the infinite cardinality as a measure.
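The cardinality point can be illustrated with natural density: the evens and the perfect squares are both countably infinite (same cardinality), yet their limiting relative frequencies in the first N naturals differ, which is why frequency, not cardinality, can serve as a measure. A minimal sketch:

```python
import math

def density(pred, N):
    """Fraction of the integers 1..N satisfying pred."""
    return sum(1 for n in range(1, N + 1) if pred(n)) / N

N = 10**6
d_even = density(lambda n: n % 2 == 0, N)               # -> 1/2
d_square = density(lambda n: math.isqrt(n)**2 == n, N)  # -> 0 as N grows

print(d_even, d_square)  # 0.5 and 0.001 at N = 10**6
```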

Brent

--
You received this message because you are subscribed to the Google Groups "Everything List" group. To post to this group, send email to everything-l...@googlegroups.com. To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.