On 2/20/2012 10:33 AM, Evgenii Rudnyi wrote:
On 19.02.2012 22:13 Russell Standish said the following:
On Sun, Feb 19, 2012 at 11:21:01AM -0500, John Clark wrote:
On Sun, Feb 19, 2012 at 5:32 AM, Evgenii Rudnyi<use...@rudnyi.ru>

If one defines a thought experiment with Maxwell's demon well,
then it is quite clear that such a thing does not exist. Why then
spend so much time on it?

Maxwell's demon is possible in classical physics, and it was not
clear that quantum mechanics made it impossible until 1929, when Leo
Szilard proved that to be the case. And understanding just why it
cannot exist aids in understanding the relationship between energy,
information, entropy, and reversibility. Maxwell's demon was the
starting point for Rolf Landauer's discovery in 1961 that erasing
information always requires energy and increases entropy, because
it's thermodynamically irreversible.
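As a back-of-the-envelope illustration of Landauer's bound (the room temperature of 300 K is my assumption, not a figure from the thread):

```python
import math

# Landauer's bound, sketched numerically: erasing one bit of information
# dissipates at least k_B * T * ln(2) of energy as heat.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

E_bit = k_B * T * math.log(2)   # minimum energy cost per erased bit, J
print(f"Landauer bound at {T} K: {E_bit:.3e} J per bit")
# roughly 2.9e-21 J -- many orders of magnitude below what real logic
# gates dissipate, which is why the bound is of theoretical interest only
```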

Good answer, John. Does anyone want to pick on Evgenii's comments
about Chris Adami's book?

It's weird, because Chris's book gives some of the best examples of
the application of statistical physics to artificial life. In
particular, his observation that mutation should play a role
analogous to temperature in an evolutionary process, and that
several evolutionary regimes exist as mutation is varied,
corresponding to phase transitions in materials.
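That mutation-as-temperature picture can be sketched with a toy single-peak quasispecies model (my stand-in for illustration, not Avida's actual dynamics; the fitness advantage A and genome length L are assumed values):

```python
# Toy single-peak quasispecies model: the master sequence replicates with
# fitness A, all mutants with fitness 1. Q = (1 - u)**L is the chance an
# L-site genome copies without error at per-site mutation rate u. The
# equilibrium master-sequence frequency x* = (A*Q - 1)/(A - 1) collapses
# to zero once A*Q < 1 -- a sharp "error threshold", loosely analogous to
# a phase transition as temperature (here, mutation rate) is raised.

def master_fraction(A, u, L):
    Q = (1.0 - u) ** L
    return max((A * Q - 1.0) / (A - 1.0), 0.0)

A, L = 10.0, 100  # assumed fitness advantage and genome length
for u in (0.001, 0.01, 0.02, 0.03):
    print(f"u = {u:.3f}  master fraction = {master_fraction(A, u, L):.3f}")
# the ordered regime disappears near u ~ ln(A)/L, about 0.023 here
```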

I have nothing against Adami's book as such. His description of his software Avida and his experiments with it is fine. My point was about his claim that his work has something to do with thermodynamics. It definitely does not: the thermodynamic entropy is not there. The quotes from the book display this pretty clearly.

You have written about "an analogous role". I would not object if you said that there is an analogy between the thermodynamic entropy and information. Yet I am against the statement that the thermodynamic entropy *is* information, and I believe that I have given many examples that show this.

What you are overlooking is that information is *about* things. So entropy in thermodynamics is information about the system's location in phase space. That's what connects "information", "work", and "temperature". Entropy in communication theory is about the location of a message in message space. It's a different application of the same concept. The two overlap when considering the minimum free energy required by a physical realization of a computation - but existing computers operate far above those minimums, so the overlap is only of theoretical interest.
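The "same concept" point can be made concrete with a toy probability distribution (my example, chosen arbitrarily): the Gibbs and Shannon entropies of one and the same distribution differ only by the constant factor k_B ln 2.

```python
import math

# Same formula, two units (illustration with an assumed toy distribution):
#   Shannon entropy in bits:  H = -sum p_i * log2(p_i)
#   Gibbs entropy in J/K:     S = -k_B * sum p_i * ln(p_i) = k_B * ln(2) * H
k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_bits(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]   # hypothetical distribution over three microstates
H = shannon_bits(p)     # 1.5 bits
S = gibbs_entropy(p)    # the same quantity expressed in J/K
assert abs(S - k_B * math.log(2) * H) < 1e-30
print(f"H = {H} bits, S = {S:.3e} J/K")
```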

Thermodynamic entropy is not subjective and not context dependent*, so my claim is that Adami does not understand what the thermodynamic entropy is. He has never taken a class in experimental thermodynamics; that is the problem.

I'm beginning to think you have never taken a class in statistical mechanics. There's a good online course here:


Those particularly relevant to this thread start at


and go through the next six or seven.


* I would accept the notion that the entropy is context dependent in the sense that its definition depends on the thermodynamic theory. If we change the theory, then the entropy could have some other meaning. But that seems not to be what you meant.


This phenomenon I have observed in my own evolutionary experiments.
Plus, it appears to be correlated with Mark Bedau's evolutionary

This is the paper I usually refer to, although his ideas have
evolved somewhat since 1998:

M. A. Bedau, E. Snyder, N. H. Packard. 1998. A Classification of
Long-Term Evolutionary Dynamics. In C. Adami, R. Belew, H. Kitano,
and C. Taylor, eds., Artificial Life VI, pp. 228-237. Cambridge: MIT
Press. Also published as Working Paper No.98-03-025, Santa Fe
Institute, Santa Fe, NM.


You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.