On 20.02.2012 22:43 Russell Standish said the following:
On Mon, Feb 20, 2012 at 07:33:13PM +0100, Evgenii Rudnyi wrote:


I have nothing against Adami's book as such. His description of his software avida and his experiments with it are okay. My point was about his claim that his work has something to do with thermodynamics. It definitely does not: the thermodynamic entropy is not there. The quotes from the book display this pretty clearly.

You have written about "an analogous role". I would not object if Chris used the word analogy to connect mutation and temperature, but not to claim an analogy between information and entropy.

This is what I meant, and this is the point where I disagree. Adami comes to the conclusion that the thermodynamic entropy is subjective. Let me quote him again:

p. 96 “If an observer gains knowledge about the system and thus determines that a number of states that were previously deemed probable are in fact unlikely, the entropy of the system (which now has turned into a conditional entropy) is lowered, simply because the number of different possible states is now lower. (Note that such a change in uncertainty is usually due to a measurement.)”

p. 97 “Clearly, the entropy can also depend on what we consider “different”. For example, one may count states as different that differ by, at most, del_x in some observable x (for example, the color of a ball drawn from an ensemble of differently shaded balls in an urn). Such entropies are then called fine-grained (if del_x is small), or coarse-grained (if del_x is large) entropies.”

The entropy he is talking about in these quotes has nothing to do with the thermodynamic entropy. You can close or open your eyes; the entropies tabulated in the JANAF Tables do not change.
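
To be concrete about what kind of entropy these quotes describe, here is a minimal sketch of my own (toy numbers, nothing from Adami's book): the quantity changes when the observer's knowledge or the graining changes, which is exactly why it cannot be the tabulated thermodynamic entropy.

# My own toy illustration, not Adami's code: the entropy in his quotes is a
# property of a probability assignment, so it changes when the observer's
# knowledge or the graining changes. A tabulated thermodynamic entropy, as in
# the JANAF Tables, is a fixed measured number and does not.
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fine-grained description: 8 equally likely states.
fine = [1.0 / 8] * 8
print(shannon_entropy(fine))            # 3.0 bits

# A measurement rules out half of the states: the conditional entropy drops.
conditioned = [1.0 / 4] * 4
print(shannon_entropy(conditioned))     # 2.0 bits

# Coarse-graining: lump the 8 states into 2 "colors" (large del_x).
coarse = [0.5, 0.5]
print(shannon_entropy(coarse))          # 1.0 bit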



you say that there is an analogy between the thermodynamic entropy and information. Yet I am against the statement that the thermodynamic entropy is information, and I believe that I have given many examples that show this. Thermodynamic entropy is not subjective and not context-dependent*, so my claim is that Adami does not understand what the thermodynamic entropy is. He has never taken a class in experimental thermodynamics; this is the problem.


I can't speak for Chris, but somehow I doubt that very much.

* I would accept the notion that the entropy is context-dependent in the sense that its definition depends on the thermodynamic theory. If we change the theory, then the entropy could have some other meaning. But that seems not to be what you meant.



It is true that in thermodynamics, there is usually little argument
about what the macroscopic variables are. As a consequence, entropy
is essentially an objective quantity, and the context fades into the
background.

But even between the (micro-/grand-)canonical ensembles, there are subtle differences in which macroscopic variables are significant, hence differences between the entropies, which vanish in the thermodynamic limit.
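
A toy version of that last point, as I understand it (my own sketch, not Russell's: N two-level spins, comparing the microcanonical count with the canonical entropy at matched mean energy):

# My own toy check: N independent two-level spins, M of them excited.
# Microcanonical entropy: S_mc = ln C(N, M).
# Canonical entropy at matched mean energy (p = M/N per spin):
#   S_can = N * ( -p ln p - (1-p) ln(1-p) ).
# The absolute difference grows only like ln N, so per spin it vanishes.
import math

def log_binomial(n, k):
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

for n in (100, 10_000, 1_000_000):
    m = n // 4                      # fix the energy: M = N/4 excited spins
    p = m / n
    s_mc = log_binomial(n, m)
    s_can = n * (-p * math.log(p) - (1 - p) * math.log(1 - p))
    print(n, (s_can - s_mc) / n)    # per-spin difference tends to 0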


What does this mean for the application I have mentioned, and for information in IT? I still do not understand this, as the numerical values of information in IT and of information derived from the thermodynamic entropy are quite different. Hence it is completely unclear how to use this in practical applications. What, then, does it bring?
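
To put rough numbers on the mismatch (my own back-of-the-envelope, using the conventional bridge of k_B ln 2 joules per kelvin per bit):

# My own back-of-the-envelope numbers, using the conventional k_B*ln2 per bit
# conversion: thermodynamic entropies per mole correspond to enormous numbers
# of "bits", while everyday IT quantities correspond to negligible J/K.
import math

K_B = 1.380649e-23                    # Boltzmann constant, J/K
J_PER_K_PER_BIT = K_B * math.log(2)   # about 9.57e-24 J/K per bit

# One gigabyte of data, expressed as thermodynamic entropy:
one_gigabyte_bits = 8e9
print(one_gigabyte_bits * J_PER_K_PER_BIT)   # roughly 8e-14 J/K

# A typical tabulated molar entropy of order 100 J/(mol K), expressed in bits:
print(100.0 / J_PER_K_PER_BIT)               # roughly 1e25 bits

So the two scales differ by many orders of magnitude, which is the practical problem I mean.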

You have written about semiotics:

"I've never seen one useful conjecture come out of it."

What useful conjectures come from saying that, because the equations for information and the entropy are the same, they must be the same thing?
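
(For the record, the coincidence referred to is just

  H = -\sum_i p_i \log_2 p_i       (Shannon)
  S = -k_B \sum_i p_i \ln p_i      (Gibbs)

which differ only by the constant factor k_B ln 2; by itself this does not make the two quantities the same thing.)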

Evgenii

