Thanks, Eric, for taking the question seriously.  I will study your answer
with care. 

All the best, 

Nick 

-----Original Message-----
From: friam-boun...@redfish.com [mailto:friam-boun...@redfish.com] On Behalf
Of Eric Smith
Sent: Thursday, June 30, 2011 8:35 AM
To: The Friday Morning Applied Complexity Coffee Group
Subject: Re: [FRIAM] symmetry breaking

Nick, hi,

This time I really, really am under the gun and have no business answering.
But you are not being foolish.  You are pushing correctly on a set of
statements that are not a principle.  As Steve and Peter and others have
said, the only way to properly handle this is actually to work out a full
solution, and then figure out what global properties the particular solution
is expressing, and I haven't done that, so I can't provide anything that
counts as an adequate answer.
But a couple of points that I think are relevant.

The intuition you are pushing:  Water up high has gravitational potential
energy.  That by itself doesn't get turned into heat, so it doesn't directly
dissipate.  If the water can fall, the potential can be converted to kinetic
energy, and from there, various frictional effects can indeed convert the
bulk motion of the water to thermal motion.  This is why you would be
looking for phenomena that both speed the rate of fall and then, afterward,
speed the slowing of average motion through frictions into disorderly
motion.  Internal turbulence etc. are all good pathways through which that
can happen.
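
To put the bookkeeping in one line (a toy account, tracking a single
parcel of mass m that falls a height h into surroundings at temperature
T):

    \[
      m g h \;\to\; \tfrac{1}{2} m v^{2} \;\to\; Q = m g h,
      \qquad \Delta S_{\mathrm{surroundings}} \approx \frac{m g h}{T}.
    \]

The potential energy only becomes entropy at the last arrow; the fall
itself is loss-free, which is why the frictional pathways are where the
action is.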

On use of dissipation:  I think that, to the extent that it ever means
anything in these conversations, "dissipation" means the conversion of
energy from a mechanical form into a thermal form that goes into the
entropy.  I will say in a moment why that usage in most conversations from
Schroedinger and Brillouin onward, through Prigogine, is not reliable.  But
at least that statement is precise enough that it can be falsified.  And
sometimes it is okay; it's just that those sometimes are case-dependent.

On other factors such as constraints:  Yes, all the conversation about
stirring has to do with the role of angular momentum, as well as potential
energy, as a constraint on the configurations available to the system, and
on the processes through which they can dissipate or do anything else.  Any
relaxation process, however described, will be constrained by whatever
factors are put into its initial conditions, so inequivalent initial
conditions need not lead to directly comparable downstream phenomena.
Angular momentum that you put in by initial stirring must be transferred to
the sink or the air, before the fluid can slow enough to reach the drain,
and that takes time, because the transfer of angular momentum is mediated by
boundary-layer dynamics, sink-shape effects, etc., just like everything
else. 
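
A back-of-envelope version of that time-scale claim (my crude bound, with
tau_b standing for whatever torque the boundary layer and walls can
actually exert on the rotating fluid):

    \[
      \frac{dL}{dt} = -\tau_b, \qquad L = I\omega, \qquad
      t_{\mathrm{spin\,down}} \;\gtrsim\; \frac{I\,\omega_0}{\tau_{b,\max}},
    \]

so whatever the details of the flow, the fluid cannot shed the angular
momentum you stirred into it any faster than the boundary can export it.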

On the reason the standard use of "dissipation" doesn't get you to
principles:  When people say "the entropy", they usually mean the function
which -- IN AN EQUILIBRIUM ENSEMBLE -- would be the proper measure of
uncertainty or distribution of energy among degrees of freedom.  All these
flow problems that we talk about are not described by equilibrium ensembles;
they are ensembles of processes.  Of course, everybody says that, but
apparently most of the time people don't act as if saying that should then
carry meaning for what they think afterward.  (Like other mantras, its
function appears to be to suppress pre-frontal cortex activity.)  A process
with states and flows has more that you can know about it than a
process with the same states but without flows.  This means that its
uncertainty or distribution are more constrained -- because they are
constrained by additional quantities -- than the equilibrium counterparts.
Therefore, the function that is the entropy for an equilibrium ensemble will
no longer generally be the correct measure of uncertainty or distribution
for an ensemble of processes.  The CONCEPT of entropy is still fine, and
even the Shannon/Boltzmann definition in terms of probabilities can still be
fine.  But the probabilities come to be defined on spaces of histories
rather than merely of states.
The entropy function that results from the Shannon/Boltzmann construction
is then not generally the same function that would arise for
any equilibrium ensemble.  Likewise, the ways that constraints act over the
course of histories, and the expression of that action in terms of
_functional_ gradients of this new entropy function, will generally be new
functional forms.  In a limited set of cases, the
starting- and ending-condition equilibrium ensembles are restrictive enough
that one can get away without an explicit model of the dynamics, and then
"dissipation" expressed in terms of the equilibrium entropy may predict
correct descriptions.  But in other cases, while the asymptotic beginning
and ending configurations will continue to place bounds on what can happen,
those bounds may be so loose that they are not directly informative, and one
will be forced to go to an entropy function that is influenced by the
dynamics, to identify what the true constraints are.  To my knowledge, no
systematic prescription exists to determine which systems will require a
proper treatment, and which will be tractable with the "entropy-production"
approximations a la Prigogine.  But I do find it a shame that the relentless
publicity machines have so taken over the world that everyone seems to
follow Prigogine-speak (which was also Schroedinger-speak and
Brillouin-speak and Henry-Quastler-speak, etc.), when the use of entropies
on histories is common in the Markov chain literature, and was thoroughly
(terrifyingly) understood by Kolmogorov and his students in the 1950s, and
reiterated by E. T. Jaynes in the 1980s and later (though I am not sure how
much on Caliber is in Jaynes's textbook).  However, since you originally
sent the Ken Dill et al. papers on using Caliber for the two-state system, I
know you have good sources that go back to the originals.
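
To make the distinction concrete, here is a toy two-state illustration
(mine, not anything from the Dill papers): two Markov chains with the same
stationary distribution, hence the same equilibrium entropy, but different
entropies on histories.

    import numpy as np

    def stationary(P):
        # stationary distribution of a row-stochastic matrix P
        w, v = np.linalg.eig(P.T)
        pi = np.real(v[:, np.argmax(np.real(w))])
        return pi / pi.sum()

    def state_entropy(P):
        # Shannon entropy of the single-time (equilibrium-style) distribution
        pi = stationary(P)
        return -np.sum(pi * np.log(pi))

    def entropy_rate(P):
        # entropy per step of the process on histories:
        #   h = -sum_i pi_i sum_j P_ij log P_ij
        pi = stationary(P)
        return -np.sum(pi[:, None] * P * np.log(P))

    # two chains with the SAME stationary distribution (1/2, 1/2)
    P_lazy = np.array([[0.9, 0.1],
                       [0.1, 0.9]])   # switches state rarely
    P_busy = np.array([[0.5, 0.5],
                       [0.5, 0.5]])   # switches state half the time

    for name, P in [("lazy", P_lazy), ("busy", P_busy)]:
        print(name, state_entropy(P), entropy_rate(P))
    # both report state entropy log 2 (~0.693 nats), but the path entropy
    # rates differ (~0.325 vs ~0.693 nats per step): the equilibrium
    # entropy function cannot tell the two processes apart.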

btw, I owe you a debt of thanks for the Dill papers.  I have a review out
recently in Reports on Progress in Physics, entitled Large-deviation
principles, ... path entropies ..., which only treats the discrete two-state
system, but at least addresses some of these points, and shows how they all
fit together.  Your raising the Dill papers prompted me to write this, so I
could understand the relations.
It's all meant as review material and doesn't contain anything deep or new,
but perhaps makes a better-faith attempt at an answer than this email does.

If I had the possibility to analyse this system, which I have wanted to do
for many years, I would try to form the full path ensemble, calculate its
entropy, and see whether it decomposes into anything simpler.  For now, the
complexity of fluid dynamics and measures on continuous spaces has put me
off, and I have stayed with simpler discrete-state, discrete-time processes,
where one can learn what these ideas look like, with minimal overhead in
making sure that the measures have been correctly constructed.
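
In that discrete setting the whole program fits in a few lines.  A minimal
sketch of what I mean (an arbitrary two-state chain, chosen only for
illustration): form every history, compute the entropy of the resulting
distribution on path space, and check whether it decomposes.

    import itertools
    import numpy as np

    P = np.array([[0.8, 0.2],
                  [0.3, 0.7]])    # arbitrary two-state transition matrix
    p0 = np.array([0.5, 0.5])     # initial distribution
    T = 6                         # number of states in each history

    def path_prob(path):
        # probability of one history s_0, s_1, ..., s_{T-1}
        p = p0[path[0]]
        for a, b in zip(path, path[1:]):
            p *= P[a, b]
        return p

    # entropy of the full path ensemble, by brute-force enumeration
    probs = np.array([path_prob(s)
                      for s in itertools.product(range(2), repeat=T)])
    H_paths = -np.sum(probs * np.log(probs))

    # for a Markov chain it decomposes into simpler pieces:
    #   H(paths) = H(p0) + sum_t H(s_{t+1} | s_t)
    H0 = -np.sum(p0 * np.log(p0))
    H_cond, p = 0.0, p0.copy()
    for _ in range(T - 1):
        H_cond += -np.sum(p[:, None] * P * np.log(P))
        p = p @ P                  # marginal at the next time step
    print(H_paths, H0 + H_cond)    # the two numbers agree

The Markov case decomposes exactly; whether the fluid problem's path
entropy does anything comparably simple is exactly what I would want to
find out.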

Of course, this is not an answer, but I hope it contributes something
toward comfort in putting the question.

All best,

Eric



============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
