Tim May wrote:

> Boehm's "hidden variables" model is generally discredited (some would
> say "disproved"). Alternatives to the Copenhagen Interpretation, notably
> EWG/"many worlds," Hartle's "consistent histories," and Cramer's
> transactional model, are still not deterministic, in that the world an
> observer is in ("finds himself in") is still not predictable in advance.
> Operationally, all interpretations give the same results, i.e., the
> Uncertainty Principle. (Which is why I mentioned "hidden variables," the
> only alternative theory which _might_ have restored classical
> Lagrange/Laplace predictability, in theory.)

For no other reason than namedropping, I want to point out that Bohm
worked out much of this theory in the building I'm in now - Birkbeck
College gave him a job after he'd been more or less forced out of the
USA by HUAC. I exchanged a few words with one of his ex-collaborators
(in the theory, not the Party) in the lift (elevator) a couple of hours
ago. About the soup, not about physics. I can't claim to understand the
physics.

As far as I know (not far, because I can only follow the English
descriptions, not the maths), Bohm's "hidden variables" never claimed
to make the universe predictable, even if deterministic. What they were
after was some way of describing what they thought was really going on,
instead of merely what we can observe (and what they thought was really
going on seems to end up at God, more or less). One of the criticisms
of the theory was that even if it were true, it would (some claimed) be
impossible to distinguish experimentally from non-hidden-variable
descriptions. That might come to the same thing as saying
"operationally, all interpretations give the same results" (I tend to
be wary of sentences that start with the word "operationally").

Back nearer to on-topic, Tim's explanation of why the world could not
be predicted even if it were locally (microscopically) predictable
sounds spot-on. I may not know much about physics, but I know enough
about biology, and the mathematical modelling of biology, to see how
big the numbers get and how quickly. In a billiard-ball universe we
couldn't predict the exact behaviour of a medium-sized protein
molecule, never mind a whole cell. The world is too complex and
detailed to predict; we hit the combinatorial explosion almost as soon
as we set out. I'm working on some individual-based models of stylised
biochemical pathways in model microorganisms, and the simplest
inquiries smash into impossible numbers at resolutions many orders of
magnitude coarser than anything where quantum considerations become
relevant.
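
As a toy illustration of how fast it blows up (a back-of-the-envelope
sketch in Python, nothing to do with my actual models, and the figure
of three states per molecule is purely an assumption for the sake of
arithmetic):

# Count the joint microstates of a stylised pathway in which every
# molecule is tracked individually and can sit in one of a few
# discrete states.  Exhaustive enumeration dies almost immediately.

def microstates(n_molecules, states_per_molecule=3):
    """Joint states when every molecule is tracked individually."""
    return states_per_molecule ** n_molecules

for n in (10, 50, 150, 1000):
    power_of_ten = len(str(microstates(n))) - 1
    print("%5d molecules -> roughly 10^%d joint states" % (n, power_of_ten))

Ten molecules already give about 10^4 joint states; a thousand give
about 10^477, and that is for a grotesquely simplified pathway.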


> And even if the world were Newtonian, in a classical billiard ball
> sense, with Planck's constant precisely equal to zero, predictability is
> a chimera. Consider a game of billiards, with perfectly spherical
> billiard balls, a perfectly flat table, etc. Trajectories depend on
> angles to a precision that keeps going deeper and deeper into the
> decimals. For example, predicting the table state after, say, 3 seconds,
> might require knowing positions, speeds, and angles (in other words, the
> vectors) to a precision of one part in a thousand. Doable, one might say.
> 
> But after 30 seconds, any "errors" that are greater than one part in a
> billion would lead to "substantially different" table states. Fail to
> know the mass or position or elasticity or whatever of just one of the
> billiard balls to one part in a billion and the outcome is no longer
> "predictable."
> 
> After a couple of minutes, the table positions are different if anything
> is not known to one part in, say, 10^50.
> 
> Even talk of a "sufficiently powerful computer" is meaningless when the
> dimensions of objects must be known to better than the Planck-Wheeler
> scale (even ignoring issues of whether there's a quantum foam at these
> dimensions).
> 
> I feel strongly about this issue, and have thought about it for many
> years. The whole "in principle the Universe can be calculated" was a
> foray down a doomed path WHETHER OR NOT quantum mechanics ever came out.
> 
> The modern name for this outlook is "chaos theory," but I believe
> "chaos" gives almost mystical associations to something which is really
> quite understandable: divergences in decimal expansions.
> 
> Discrepancies come marching in, fairly rapidly, from "out there in the
> expansion."
> 
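
You can watch those divergences happen with a few lines of Python.
This is just the textbook logistic map (my choice of toy, nothing to
do with billiards or with anything Tim wrote), but the arithmetic
point is the same: a disagreement out in the twelfth decimal place
grows to order one within about forty steps.

# Two runs of the chaotic logistic map x -> 4x(1-x), started a hair's
# breadth apart, showing an error in the 12th decimal place swamping
# the prediction.

def step(x):
    return 4.0 * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-12   # identical except for one part in 10^12
for i in range(1, 61):
    a, b = step(a), step(b)
    if i % 10 == 0:
        print("step %2d: difference = %.3e" % (i, abs(a - b)))

A proper billiard simulation would need far more bookkeeping, but the
exponential character of the divergence would be the same.
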
> Another way of looking at unpredictability is to say that real objects
> in real space and subject to real forces from many other real objects
> are in the world of the "real numbers" and any representation of a real
> number as a 25-digit number (diameter of the solar system to within 1
> centimeter) or even as a 100-digit number (utterly beyond all hope of
> measurement!) is just not enough.
> 
> (Obscure aside: What if Wheeler and others are right in some of their
> speculations that the universe is ultimately quantized, a la "It from
> Bit"? Notwithstanding that we're now back into quantum realms and
> theories, even if the universe were quantized at Planck-scale levels,
> e.g., at 10^-33 cm levels, the gravitational pull of a distant galaxy
> would be out somewhere at the several hundredth decimal place, as a wild
> guess. The point being, even a grid-based positional system would yield
> the same "real number" issues.)
> 
> In short, predictability is a physical and computational chimera: it
> does not, and cannot, exist.
> 
> > Admittedly the current line of thinking is that entropy
> > exists, but there we still have not proven that it must exist.
> 
> Just as no sequence can ever be proved to be "random," so, too, one can
> never prove that entropy "exists" except in the definition sense.
> 
> Our best definition of what we mean by random is that of a thing
> (sequence, for example) with no shorter description than itself. This is
> the Kolmogorov-Chaitin definition. Much more is available on the Web
> about this, which is generally known as "algorithmic information theory."
> 
> --Tim May
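
A crude way to get a feel for the Kolmogorov-Chaitin idea in Python (a
toy of my own, not Tim's: a compressor only gives an upper bound on
description length, and failing to compress proves nothing, since true
algorithmic randomness is uncomputable):

# Compare how well a general-purpose compressor does on a highly
# regular string versus bytes from the OS random source.  Compressed
# size is only an upper bound on the length of a description.
import os, zlib

regular = b"abcdefgh" * 1000   # 8000 bytes with an obvious short description
noisy = os.urandom(8000)       # 8000 bytes we hope have no short description

for name, data in (("regular", regular), ("noisy", noisy)):
    print("%s: %d bytes -> %d bytes compressed"
          % (name, len(data), len(zlib.compress(data, 9))))

The regular string shrinks to a few dozen bytes; the bytes from
os.urandom typically come out slightly larger than they went in, which
is what you would expect if (and it is an "if") they have no shorter
description than themselves.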
