Joel:
>> It seems to me there is a great deal more information in PI than
>> just the 2 bytes it takes to convey it in an email message.


Russell:
> Not much more. One could express pi by a short program - eg the
> Wallis formula, that would be a few tens of bytes on most Turing
> machines. Even expressing it as a pattern on your beloved CA, it
> would probably not consume more than a few hundred bytes.
Yes, I see. Juergen pointed this out too, and it's a valid distinction
between different representations of the same mathematical object. You are
both correct: pi can indeed be represented finitely (as a program).
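Russell's Wallis-formula point can be sketched in a few lines. This is a minimal illustration in Python (my own sketch, not anyone's actual program): a short, finite piece of text that generates pi to any desired accuracy.

```python
# Wallis product: pi/2 = prod_{n>=1} (2n)/(2n-1) * (2n)/(2n+1).
# The program is tiny and finite, even though the number it converges
# to has an infinite decimal expansion.
def wallis_pi(n_terms):
    product = 1.0
    for n in range(1, n_terms + 1):
        product *= (2.0 * n) / (2 * n - 1) * (2.0 * n) / (2 * n + 1)
    return 2.0 * product

print(wallis_pi(100000))  # approaches 3.14159...; convergence is slow
```

Of course, any single run only ever produces a finite-precision approximation, which is exactly the distinction I'm drawing below.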
But I don't dispute this, as I wasn't talking about the finite
representation. I was talking about the infinite process / function that pi
represents.
Maybe this is obvious, but my whole point is that we are fooling ourselves
if we think we can compute physics using expressions that consume infinite
resources (memory, or computing time). Yes, I understand that the universe
as a whole may grow without bound (infinite history), but at any given
moment, it must be of finite size. Otherwise we can't compute it!
For example, if somehow the universe requires computations like the
following:
x = 0
do
    x = x + pi()
    print x
loop
Then we are doomed. We cannot run this kind of program. Yes, I know we can
find a finite representation like this:
x = 0
do
    x = x + 1
    print x; " pi"
loop
But does this REALLY make use of the details of pi? I don't think so.
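The trick in that second loop - keeping pi symbolic and only counting its multiples - can be made concrete. Here is a hypothetical Python sketch (the class name and methods are my own invention for illustration): the program manipulates only a finite integer, and never touches a single digit of pi.

```python
class PiMultiple:
    """Represents x = k * pi symbolically; only the integer k is stored.
    The digits of pi are never used - which is exactly the point above."""
    def __init__(self, k=0):
        self.k = k

    def add_pi(self):
        # "x = x + pi" becomes a finite operation on the coefficient k.
        return PiMultiple(self.k + 1)

    def __str__(self):
        return f"{self.k} pi"

x = PiMultiple()
for _ in range(3):
    x = x.add_pi()
print(x)  # "3 pi"
```

A finite representation, yes - but it sidesteps the details of pi rather than using them.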
I'm simply trying to get people to confront the truth that we humans are
incapable of devising Theories of Everything that cannot be run on a
universal computer. That's all.
Many will say, "Of course! We know that!".
And then they go on, as if nothing happened, talking about the probabilities
of items in infinite sets, and "independent tosses of a fair coin", and
"quantum indeterminacy", and "the continuum of the real numbers", as if
these things exist!
If we cannot program it... it's not a Theory of EVERYTHING. It's just a
description.
Let us take the realist approach and focus on the things we can actually
compute fully.
Joel