On Wed, Feb 24, 2010 at 1:02 AM, Charles <charlesrobertgood...@gmail.com> wrote:

> The point about amplification is that all normal detection events
> require amplification, such as photographic film, photomultipliers and
> so on. We never detect a quantum event directly, but rather the result
> of that event having caused some sort of cascade process. My point was
> that it would be impossible to arrange for this to occur in the time-
> reversed case.
> However, the comment about amplification was in response to a previous
> question, and was probably in any case a side issue. The original
> question was why we can't use this approach to send messages into the
> past (IIRC), and the fact that detections involve amplification and
> that equivalent processes can't be arranged in a past-directed time
> sense is, although true, not the fundamental reason, which is
> concerned with what it would *mean* to detect a message being sent
> into the past. (We would have to somehow read the state of a photon
> that is about to be emitted, for example.)
> Price's idea (if I understand him correctly) is that the behaviour of a
> particle is constrained by both path and future boundary conditions,
> hence my use of "significant". In a broader sense, all quantum
> interactions are (according to Price - if I understand him correctly -
> please take that as read in future comments!) equally constrained by
> the past and future, but we live in a universe in which (for some
> reason) there is an entropy gradient. Time symmetry is pervasive
> within the network of interactions between the particles making up the
> world, but isn't obviously manifested on any other scale, where the
> entropy gradient dominates (except as puzzling results in quantum
> measurements).
> If you view the universe as a large Feynman diagram, then each branch
> of it should be equally constrained by the event at its past and
> future end points. However, superimposed on top of this is an entropy
> gradient that puts the pastward end of all the trajectories into a
> very special configuration that we describe as low entropy. Hence the
> time-symmetry is quite hard to see from our perspective, as creatures
> made up of processes that are all oriented along the entropy gradient.

Yes, this is the mainstream point of view, not unique to Price. It's
generally thought that the reason we see an arrow of time at the macroscopic
level--including the arrow of time inherent in the fact that we can look at
records in the present and gain knowledge of past events, but we can't do
the same for future events--is ultimately explained by the low-entropy
boundary condition at the Big Bang. In a deterministic universe, information
about the future would actually be implicit in the total distribution of
matter/energy in the present, but the relation between future events and
present information about them would be one-to-many: you'd basically have to
know the precise position and velocity of every single particle in the past
light cone of a future event in order to reconstruct what that future event
would actually be. Because of the entropy gradient, with past events and
present records you can have a one-to-one relationship (or at least a
one-to-few relationship), where localized collections of particles can
function as records of past events.
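As a toy illustration of this point (my own sketch, not from the original
post): even under deterministic, time-reversible dynamics, a low-entropy
initial condition makes coarse-grained entropy grow in the forward
direction, which is the asymmetry that lets small present records stand in
for past events while future events stay smeared across the full microstate.

```python
import math
import random

def coarse_entropy(positions, n_cells=10, size=100.0):
    """Shannon entropy of the coarse-grained occupancy distribution."""
    counts = [0] * n_cells
    for x in positions:
        counts[min(int(x / (size / n_cells)), n_cells - 1)] += 1
    total = len(positions)
    return -sum((c / total) * math.log(c / total)
                for c in counts if c > 0)

random.seed(0)
size = 100.0
# Low-entropy initial condition: all particles bunched near the left wall.
pos = [random.uniform(0, 5) for _ in range(2000)]
vel = [random.uniform(-1, 1) for _ in range(2000)]

s0 = coarse_entropy(pos)
# Deterministic, time-reversible free streaming with reflecting walls.
for _ in range(300):
    for i in range(len(pos)):
        pos[i] += vel[i]
        if pos[i] < 0:
            pos[i], vel[i] = -pos[i], -vel[i]
        elif pos[i] > size:
            pos[i], vel[i] = 2 * size - pos[i], -vel[i]
s1 = coarse_entropy(pos)
print(s0, s1)  # coarse-grained entropy rises toward log(10)
```

The microscopic rule is reversible (flip every velocity and the gas
reassembles into its corner), but from the coarse-grained view entropy only
ever seems to increase, because the initial condition was special.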

The problem I was alluding to had to do with the fact that Price is arguing
for "retrocausation" not just in the broad sense of any arbitrary
time-symmetric theory, where the entire distribution of particles in the
past light cone of some future event can be said to contain information
about that future event (and thus to 'anticipate' it in a sense), but in a
more narrow one-to-one sense. He's saying that the hidden variables states
of just *two* entangled particles will depend in a lawlike way on the future
measurements performed on these particles. If these variables weren't
hidden--if you could actually know the hidden-variables states of particles
before they were measured--then you could use them to know in advance what
measurement was going to be performed in the future. And the experimenters
could base their decision on what experiment to perform on the outcome of
some complicated future event involving many particles (say, a horse race!),
so in a sense you can even have a many-to-one relationship between future
events and present "records" of these events in the form of hidden-variables
states for individual pairs of particles.

A priori, if you didn't know anything about QM, it would seem natural to
expect that any theory involving "retrocausation" in this narrow sense,
where a very small number of variables in the present can be perfectly
correlated with the outcome of arbitrary future events like horse races,
would probably allow you to send messages into the past. So it seems odd,
then, that nature "conspires" to prevent this by making it impossible in
principle to know the simultaneous states of this small collection of hidden
variables that are imagined to exist by the retrocausal theory. It seems
only a very limited set of all possible retrocausal theories would have this
property of "censoring" fortune-telling in this way, and given that no one
has even found a general hidden-variables theory of this type that matches
up with the predictions of QM, it may well be that any such theory would end
up looking very contrived.

My point about the delayed choice quantum eraser was similar. For an
enthusiast of the narrow type of one-to-one retrocausation, it might seem
natural to describe this experiment by saying that whether or not the signal
photons form an interference pattern depends on future facts about whether
the idler photons will have their which-path information preserved (so we
can deduce which slit each signal photon went through, in which case the
complementarity principle of QM says there should be no interference
pattern) or whether their which-path information ends up getting erased (so
we don't know which slit the signal photons went through, in which case the
complementarity principle says there should be an interference pattern).
Superficially, then, one might expect that by examining the pattern of
signal photons in the present we can know in advance whether the idlers will
have their which-path information erased (and the decision whether or not to
erase might be based on the outcome of some other event--a horse race
again!). But then if you look at the fine print, you realize that in the
delayed choice quantum eraser experiment the *total* pattern of signal
photons never shows any interference pattern, regardless of what happens to
the idlers. In the case where the idlers have their which-path information
erased, half the idlers go to one path-erasing detector and half go to
another, labeled D1 and D2 in the Wikipedia article...if you look at the
*subset* of signal photons whose idlers went to D1 you see an interference
pattern, likewise if you look at the subset whose idlers went to D2, but the
peaks of one interference pattern line up with the valleys of the other and
vice versa, so when you add these two subsets to look at the total pattern
of signal photons, the interference cancels out. And since we can't control
which of those two detectors each idler goes to, there's no way to know in
advance which signal photons will fall into which subset, so there's no way
to find any interference pattern in the signal photons until after the
idlers have been detected at a which-path-erasing detector. Once again,
nature seems to go to a great deal of trouble to avoid the possibility of
gaining information about the future through these kinds of experiments,
which is odd if you believe there is really some hidden one-to-one
retrocausation going on, but not so odd if you believe in an interpretation
of QM involving no such one-to-one retrocausation such as Bohmian mechanics
or the many-worlds interpretation.
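The cancellation can be checked with a few lines of arithmetic. This is a
toy sketch (not the experiment's actual analysis): assume signal photons
whose idlers reach D1 land with probability density proportional to
(1 + cos phi)/2, those whose idlers reach D2 with (1 - cos phi)/2, and that
each erasing detector receives half the idlers.

```python
import math

# Phase across the detection screen (proxy for position).
phis = [2 * math.pi * k / 100 for k in range(100)]

# Conditional fringe patterns for signal photons whose idlers reached
# the which-path-erasing detectors D1 and D2 (complementary fringes,
# peaks of one aligned with valleys of the other):
p_d1 = [0.5 * (1 + math.cos(phi)) for phi in phis]
p_d2 = [0.5 * (1 - math.cos(phi)) for phi in phis]

# Each erasing detector receives half the idlers at random, so the
# total signal-photon pattern is the equal-weight mixture:
total = [0.5 * a + 0.5 * b for a, b in zip(p_d1, p_d2)]
print(min(total), max(total))  # both ~0.5: flat, no fringes
```

Each conditional subset shows full-contrast fringes, but their equal-weight
sum is exactly flat, so nothing about the future choice can be read off the
signal photons alone.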

> > Isn't his explanation for the arrow of time the same as most other
> > physicists, namely the idea that it should be explained in terms of the
> low
> > entropy of the Big Bang?
> Yes, but he is interpreting this in a slightly different way to how it
> is normally perceived. Many attempts to explain the "arrow of
> time" (at least many attempts reported in "New Scinetist", which seems
> to go in for them regularly) involve in some sense "local conditions",
> in particular something inherent in the laws of physics. Price is
> suggesting that there are no such local conditions, and that the
> initial low entropy is sufficient to explain everything about the
> arrow of time. This moves the explanatory burden, of course.
> Sorry, that seems rather obvious - did I miss the point of the
> question?

I think you may have gotten the wrong idea from those New Scientist
articles, which is understandable since New Scientist often adopts a
misleadingly sensationalist tone. It's true that there are some violations
of T-symmetry in microscopic fundamental laws, but I don't think many
physicists who have thought about the arrow of time problem claim that there
is likely to be any connection between these microscopic violations and the
arrow of time we see at the macroscopic level (all the physicists I've read
discussing the issue have dismissed the idea of such a connection, anyway);
the macroscopic arrow is generally assumed to be solely due to
thermodynamics (and thus to trace back ultimately to the low entropy of the
Big Bang). One reason for this is that microscopic violations of
T-symmetry only involve fairly exotic particles, whereas the interactions of
particles that make up everyday objects are T-symmetric ones. And I think
another, more subtle reason physicists don't see a connection as likely is
that while quantum field theory allows violations of T-symmetry, it's
been proven that any relativistic quantum field theory should obey a larger
symmetry known as CPT-symmetry, which leads to the same sort of
*thermodynamic* conclusions as T-symmetry. To compare the two, T-symmetry
says that if you record the evolution of any physical system, the laws of
physics should always theoretically permit you to construct a second
physical system whose evolution forward in time would look exactly like a
movie of the first system played backwards. CPT-symmetry, on the other hand,
says that if you record the evolution of a system (or more accurately, the
evolution of the system's wavefunction), it should be possible to construct
a second system whose forward evolution would look like a movie of the first
system which was 1) played backwards, 2) parity-flipped, meaning you take
the mirror image of the system along all three spatial axes, and 3)
charge-flipped so that each particle is exchanged for its own antiparticle
(protons for antiprotons and so forth). But thermodynamically, a
mirror-image-flipped system of antiparticles spontaneously decreasing in
entropy as it evolves forward in time should seem no less unlikely than the
original system doing so--in both cases,
only very special choices of initial conditions should result in entropy
decreasing in an isolated system. So we're still left with the conclusion
that the only way to explain why the second law works in the forward
direction but not in reverse is to appeal to the very special low-entropy
boundary conditions at the Big Bang.

For an example of a physicist dismissing the idea that violations of
T-symmetry which still respect CPT-symmetry are likely to have anything to
do with the thermodynamic arrow of time, check out note 2 for chapter 1 of
Brian Greene's book "Fabric of the Cosmos" which you can read on google
books here:


And he goes into more detail on this in note 2 of chapter 6 on pages
506-507...unfortunately p. 507 is not available on google books, but you can
read the part on p. 506 here:


I won't type up the full remainder of the note from p. 507 but here's the
most relevant part:

"a general theorem shows that quantum field theories (so long as they are
local, unitary, and Lorentz invariant--which are the ones of interest) are
always symmetric under the combined operations of charge conjugation C
(which replaces particles by their antiparticles), parity P (which inverts
positions through the origin), and a bare-bones time-reversal operator T
(which replaces t by -t) ... There are certain particle species (K-mesons,
B-mesons) whose repertoire of behaviors is CPT invariant but is not
invariant under T alone ... T symmetry violation has been directly
established by the CPLEAR experiment at CERN and the KTEV experiment at
Fermilab. Roughly speaking, these experiments show that if you were
presented with a film of the recorded process involving these meson
particles, you'd be able to determine whether the film was being projected
in the correct forward time direction, or in reverse. In other words, these
particular particles can distinguish between past and future. What remains
unclear, though, is whether this has any relevance for the arrow of time we
experience in everyday contexts. After all, these are exotic particles that
can be produced for fleeting moments in high-energy collisions, but they are
not a constituent of familiar material objects. To many physicists,
including me, it seems unlikely that the time nonreversal invariance
evidenced by these particles plays a role in answering the puzzle of time's
arrow, so we shall not discuss this exceptional example further. But the
truth is that no one knows for sure."

Similarly, Stephen Hawking also brought up CPT invariance in the chapter on
the arrow of time in "A Brief History of Time" (chapter 9, which can be read
online), but he didn't even bother to mention the occasional
violations of T-symmetry when trying to explain the origin of the arrow of
time; he just said on p. 144 that the laws of physics governing matter
"under all normal situations" are invariant under CP-reversal and therefore
are invariant under T-reversal as well. The rest of the chapter discusses
the origin of the arrow of time, and as in Greene's book he places the blame
solely on the low entropy of the Big Bang. Roger Penrose also devotes
chapter 7 of his book "The Emperor's New Mind" to the topic of "Cosmology
and the Arrow of Time" (parts of which can be viewed at
and likewise concludes the arrow is entirely due to a low-entropy Big
Bang. Finally, another new book by a physicist discussing the arrow of time
is Sean Carroll's "From Eternity to Here"--I haven't read it yet, and it
can't be previewed on google books, but I think judging from the FAQ at
http://preposterousuniverse.com/eternitytohere/faq.html it's apparent that
he also doesn't think these violations of T-symmetry in particle physics are
relevant to the macroscopic arrow of time:

"Don't the weak interactions violate time-reversal invariance?
Not exactly; more precisely, it depends on definitions, and the relevant
fact is that the weak interactions have nothing to do with the arrow of
time. They are not invariant under the T (time reversal) operation of
quantum field theory, as has been experimentally verified in the decay of
the neutral kaon. (The experiments found CP violation, which by the CPT
theorem implies T violation.) But as far as thermodynamics is concerned,
it's CPT invariance that matters, not T invariance. For every solution to
the equations of motion, there is exactly one time-reversed solution -- it
just happens to also involve a parity inversion and an exchange of particles
with antiparticles. CP violation cannot explain the Second Law of
Thermodynamics."

You received this message because you are subscribed to the Google Groups 
"Everything List" group.