# Re: The past hypothesis

```
On 5/1/2010 3:17 PM, Rex Allen wrote:

On Sat, May 1, 2010 at 4:08 PM, Brent Meeker <meeke...@dslextreme.com> wrote:
```
```
Seems like a good answer to me.  Suppose there were infinitely many rolls of
a die (which frequentist statisticians assume all the time).  The fact that
the number of "1"s would be countably infinite and the number of "not-1"s
would be countably infinite wouldn't change the fact that the "not-1"s are
five times more probable.
```
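Brent's point can be illustrated numerically. Below is a minimal sketch (my own illustration, not from the thread): in a long but finite run of fair-die rolls, the counts of "1"s and "not-1"s both keep growing, yet the relative frequency of "not-1" stays near 5/6. The variable names and the choice of N are arbitrary.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# A long finite run of fair-die rolls, standing in for the
# infinite sequence in the argument.
N = 200_000
rolls = [random.randint(1, 6) for _ in range(N)]

ones = sum(1 for r in rolls if r == 1)
not_ones = N - ones

# Both counts grow without bound as N does, yet the relative
# frequency of "not-1" stays near 5/6.
print(ones, not_ones, not_ones / N)
```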
```
So let's say that we have an infinitely long array of identically
sized squares.  Inside each square a single number is written, from 1
to 6.

First let's say that the numbered squares just repeat:  1, 2, 3, 4, 5,
6, 1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6...over and over, infinitely many
times.

Now, we randomly throw a dart at this infinitely long row of squares.
Should we expect to hit a 1, or not-1?  Not-1, right?  Because we have
extra information about the internal structure of the infinitely long
row.  The dart has to hit in some finite space, and the layout of the
numbers in the squares for any given finite space is known.  So the
probability of hitting a "1" is "1 in 6".

NOW.

Let's say the ordering of the numbers in the squares is completely
random.  We've lost information here.  When we throw the dart at the
row, we have no idea what numbers will be in the randomly selected
finite area we aim towards.

In an infinite random sequence, any given finite pattern will (with
probability 1) appear infinitely often...so there are stretches as
large as you care to specify that contain only 1s, or only not-1s.

Furthermore, as you say, the 1s and not-1s can be put into a
one-to-one correspondence...both sets are countably infinite.  There
are as many "1"s as "not-1"s.  And there are as many "2"s as
"not-2"s, and so on.

So, we lost a lot of information there when we abandoned the strictly
repeating structure.  Before we lost that information, we could safely
say that the probability of hitting a 1 was "1 in 6"...but after
losing that information surely we can't say anything at all about the
probability of hitting a "1" with our dart.
```
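Rex's repeating-layout case can be sketched as a simulation (my own illustration; `square_value`, the window size, and the unit-width squares are assumptions):

```python
import math
import random

random.seed(1)  # fixed seed so the run is reproducible

def square_value(x):
    """Number in the square at position x for the strictly repeating
    layout 1, 2, 3, 4, 5, 6, 1, 2, ... with unit-width squares."""
    return (math.floor(x) % 6) + 1

# Throw darts uniformly at a finite window of the row.
WINDOW = 6 * 10_000
throws = 100_000
hits_on_1 = sum(1 for _ in range(throws)
                if square_value(random.uniform(0, WINDOW)) == 1)

# With the layout known, the fraction of darts hitting a "1"
# settles near 1/6.
print(hits_on_1 / throws)
```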
Sure we can, because part of the meaning of "random", the very thing that lost us the information, is that each square has the same measure for being any one of the numbers. If, for example, we said "let all the '1's come first" (in which case we can't hit any "not-1"s), that would be inconsistent with saying we didn't have any information.
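Brent's measure argument can also be checked by simulation (again a sketch of my own, not from the thread): make each square's number an independent, equal-measure draw from 1 to 6, and the hit frequency is still near 1/6 even though the repeating structure is gone.

```python
import random

random.seed(2)  # fixed seed so the run is reproducible

# A random layout over a finite window: each unit square
# independently bears a number 1-6 with equal measure.
WINDOW = 60_000
layout = [random.randint(1, 6) for _ in range(WINDOW)]

# Each dart lands in a uniformly chosen square of the window.
throws = 100_000
hits_on_1 = sum(1 for _ in range(throws)
                if layout[random.randrange(WINDOW)] == 1)

# The ordering is random, but the equal measure per square keeps
# the hit frequency near 1/6.
print(hits_on_1 / throws)
```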
```"Whereas the interpretation of quantum mechanics has only been
puzzling us for ∼75 years, the interpretation of probability has been
doing so for more than 300 years [16, 17]. Poincaré [18] (p. 186)
described probability as "an obscure instinct". In the century that
has elapsed since then philosophers have worked hard to lessen the
obscurity. However, the result has not been to arrive at any
consensus. Instead, we have a number of competing schools (for an
overview see Gillies [19], von Plato [20], Sklar [21, 22] and Guttman
[23])." (http://arxiv.org/PS_cache/quant-ph/pdf/0402/0402015v1.pdf)

```
My personal view is that probability is a mathematical tool, something like linear algebra. It's useful precisely because it has different interpretations. Here's the introductory paragraph I wrote for a course for engineers I taught years ago. If you'd like, I can send you the rest of the hand-out off-line:
```
Probability has several different meanings and philosophers argue over them as if one must settle on the *real* meaning. But this is a mistake. Just like “cost” or “energy”, “probability” is useful precisely because the same value has different interpretations. There are four interpretations that commonly come up.
```

1. It has a mathematical definition that lets us manipulate it and draw inferences.
2. It has a physical interpretation as a symmetry.
3. It quantifies a degree of belief that tells us whether to act on it.
4. It has an empirical meaning that lets us measure it.

```
The usefulness of probability is that we can start with one of these interpretations, manipulate it mathematically, and then interpret the result in one of the other ways. For example, you might observe that dice are perfectly cubical and uniform, and so by (2) each face should be equally probable, i.e. P=1/6. Then you could calculate, using (1), that there are three ways of rolling a 4, namely (1,3), (2,2) and (3,1), out of a total of 36 possible outcomes. So the probability of a 4 on a throw is 3/36=1/12. Which tells you (3) to bet on making a point of 4 only at 12-to-1 or better odds. If you watch many games of craps and tally the results, you can approximately confirm the relative fraction of times 4 comes up (4).
```
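The dice arithmetic in the hand-out example can be checked mechanically; here is a short sketch (plain Python of my own, not part of the original hand-out):

```python
from itertools import product
from fractions import Fraction

# Interpretation (2): by symmetry each face has P = 1/6, so each of
# the 36 ordered outcomes of two dice is equally probable.
outcomes = list(product(range(1, 7), repeat=2))
assert len(outcomes) == 36

# Interpretation (1): count the ways two dice total 4.
ways_to_4 = [o for o in outcomes if sum(o) == 4]
p4 = Fraction(len(ways_to_4), len(outcomes))
print(ways_to_4)   # [(1, 3), (2, 2), (3, 1)]
print(p4)          # 1/12

# Interpretation (3): fair odds against are (1 - p) : p = 11-to-1,
# so you need 12-to-1 or better to come out ahead.
fair_odds_against = (1 - p4) / p4
print(fair_odds_against)  # 11
```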
Brent

--
You received this message because you are subscribed to the Google Groups
"Everything List" group.