On Monday, April 22, 2002, at 11:23 PM, Joseph Ashwood wrote:
>
> From: <[EMAIL PROTECTED]>
>> If a RNG runs off Johnson noise, then the ability to predict its
>> output would imply the ability to violate the second law of
>> thermodynamics. If it runs off shot noise, then the ability to
>> predict its output would disprove quantum mechanics.
>
> Actually there are models that fit the universe that are entirely
> deterministic.
Could you mention what they are?
Boehm's "hidden variables" model is generally discredited (some would
say "disproved"). Alternatives to the Copenhagen Interpretation, notably
EWG/"many worlds," Hartle's "consistent histories," and Cramer's
transactional model, are still not deterministic, in that the world an
observer is in ("finds himself in") is still not predictable in advance.
Operationally, all interpretations give the same results, i.e., the
Uncertainty Principle. (Which is why I mentioned "hidden variables," the
only alternative theory which _might_ have restored classical
Lagrange/Laplace predictability.)
And even if the world were Newtonian, in a classical billiard ball
sense, with Planck's constant precisely equal to zero, predictability is
a chimera. Consider a game of billiards, with perfectly spherical
billiard balls, a perfectly flat table, etc. Trajectories depend on
collision angles to a precision that reaches ever deeper into the
decimal expansion. For example, predicting the table state after, say, 3 seconds
might require knowing positions, speeds, and angles (in other words, the
vectors) to a precision of one part in a thousand. Doable, one might say.
But after 30 seconds, any "errors" that are greater than one part in a
billion would lead to "substantially different" table states. Fail to
know the mass or position or elasticity or whatever of just one of the
billiard balls to one part in a billion and the outcome is no longer
"predictable."
After a couple of minutes, the table positions are different if anything
is not known to one part in, say, 10^50.
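A toy calculation makes the doubling concrete. Here is a short Python
sketch, with the logistic map x -> 4x(1-x) standing in for the billiard
dynamics (chosen for brevity; it is not a billiard simulation, just a
system whose errors roughly double each step). Two states agreeing to
one part in a billion, about 2^-30, part ways after roughly 30 steps:

    # Sensitive dependence on initial conditions, in miniature: the
    # logistic map amplifies any discrepancy by about a factor of two
    # per iteration.
    def logistic(x, steps):
        for _ in range(steps):
            x = 4.0 * x * (1.0 - x)
        return x

    a, b = 0.3, 0.3 + 1e-9   # agree to one part in a billion
    for steps in (10, 20, 30, 40):
        print(steps, abs(logistic(a, steps) - logistic(b, steps)))

By 30 or 40 steps the two "tables" bear no relation to each other.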
Even talk of a "sufficiently powerful computer" is meaningless when the
dimensions of objects must be known to better than the Planck-Wheeler
scale (even ignoring issues of whether there's a quantum foam at these
dimensions).
I feel strongly about this issue, and have thought about it for many
years. The whole "in principle the Universe can be calculated" program
was a foray down a doomed path WHETHER OR NOT quantum mechanics ever
came along.
The modern name for this outlook is "chaos theory," but I believe
"chaos" gives almost mystical associations to something which is really
quite understandable: divergences in decimal expansions.
Discrepancies come marching in, fairly rapidly, from "out there in the
expansion."
Another way of looking at unpredictability is to say that real objects
in real space, subject to real forces from many other real objects, live
in the world of the "real numbers," and any representation of a real
number as a 25-digit number (diameter of the solar system to within 1
centimeter) or even as a 100-digit number (utterly beyond all hope of
measurement!) is just not enough.
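For calibration: the ubiquitous 64-bit floating-point double carries
only about 15 to 17 significant decimal digits, so even the modest
25-digit figure above is already unrepresentable. A quick Python sketch:

    # A float64 holds roughly 16 significant decimal digits; try to
    # store a 25-digit quantity and the tail is already noise.
    x = 1234567890123456789012345.0   # 25 digits intended
    print(f"{x:.0f}")                 # last several digits differ

    # At these magnitudes, adding 1 (the "1 centimeter") is a no-op:
    print(1e24 + 1.0 == 1e24)         # True -- the centimeter vanishes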
(Obscure aside: What if Wheeler and others are right in some of their
speculations that the universe is ultimately quantized, a la "It from
Bit"? Notwithstanding that we're now back into quantum realms and
theories, even if the universe were quantized at Planck-scale levels,
e.g., at 10^-33 cm levels, the gravitational pull of a distant galaxy
would be out somewhere at the several hundredth decimal place, as a wild
guess. The point being, even a grid-based positional system would yield
the same "real number" issues.)
In short, predictability is a physical and computational chimera: it
does not, and cannot, exist.
> Admittedly the current line of thinking is that entropy
> exists, but we still have not proven that it must exist.
Just as no sequence can ever be proved to be "random," so, too, one can
never prove that entropy "exists" except in the definitional sense.
Our best definition of what we mean by random is that of a thing
(sequence, for example) with no shorter description than itself. This is
the Kolmogorov-Chaitin definition. Much more is available on the Web
about this field, which is generally known as "algorithmic information
theory."
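Kolmogorov complexity is uncomputable in general, but an ordinary
compressor gives a crude upper bound on description length, and hence a
one-sided test: compressing well demonstrates non-randomness, while
failing to compress proves nothing. A Python sketch:

    # zlib as a poor man's bound on description length.  The patterned
    # string has a short description and compresses; bytes from the
    # OS's randomness source should not compress at all.
    import os, zlib

    patterned = b"abcd" * 2500      # 10,000 bytes, tiny description
    noisy = os.urandom(10000)       # 10,000 bytes of OS-supplied noise

    for label, s in (("patterned", patterned), ("noisy", noisy)):
        print(label, len(s), "->", len(zlib.compress(s, 9)))

The patterned string shrinks to a few dozen bytes; the noise comes out
essentially unchanged (slightly larger, in fact, from compressor
overhead).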
--Tim May