On 2/6/2012 11:18 AM, Evgenii Rudnyi wrote:
On 05.02.2012 22:33 Russell Standish said the following:
On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:
The funniest part is in the conclusion:

p. 28(142) "First, all notions of entropy discussed in this essay,
except the thermodynamic and the topological entropy, can be
understood as variants of some information-theoretic notion of entropy."

I understand it this way. When I am working with a gas, liquid or solid at the level of experimental thermodynamics, the information, according to the authors, is not there (on this point I agree with them). Yet, as soon as theoretical physicists start thinking about these objects, the objects turn out to be fully filled with information.


Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will
certainly be arguing against orthodoxy, but you're welcome to try.

There is some difference between the entropy in classical and in statistical thermodynamics. I will copy my old text to describe it.

In order to explain this, let us consider a simple experiment. We bring a glass of hot water into a room and leave it there. Eventually the temperature of the water becomes equal to the ambient temperature. In classical thermodynamics this process is considered irreversible: the Second Law forbids the water in the glass from spontaneously becoming hot again. This is in complete agreement with our experience, so one would expect the same from statistical mechanics. There, however, the entropy has a statistical meaning, and there is a nonzero chance that the water will become hot again. Moreover, there is a theorem (Poincaré recurrence) which states that if we wait long enough, the water in the glass must become hot again.
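
To see the tension concretely, here is a minimal sketch in Python (my own toy illustration, using the standard Ehrenfest urn model rather than anything from the paper under discussion). A handful of "molecules" start on the hot side, the system relaxes toward equilibrium, and yet, as the recurrence theorem promises, it comes back:

import random

N = 10       # number of "molecules"; kept tiny so the recurrence is observable
state = N    # all molecules on one side: the far-from-equilibrium initial macrostate
steps = 0

# At each step one molecule, picked uniformly at random, jumps to the other side.
# The system drifts toward state ~ N/2, but with probability 1 it eventually
# returns to the initial macrostate state == N.
while True:
    if random.random() < state / N:
        state -= 1
    else:
        state += 1
    steps += 1
    if state == N:
        break

print("returned to the initial macrostate after", steps, "steps")

For N = 10 the return takes on the order of 2^N ~ 1000 steps; for N of the order of 10^23 it is still guaranteed, but the expected waiting time dwarfs the age of the universe. This is why, in practice, the statistical picture does not contradict the classical irreversibility.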

Otherwise, they are the same. This does not mean, however, that information comes into play in the Boltzmann-Gibbs formulation. You have missed my comment on this, hence I will repeat it.

On 05.02.2012 19:28 Evgenii Rudnyi said the following:
> I have browsed the paper. I should say that I am not impressed. The
> logic is exactly the same as in other papers and books.
> I have nothing against the Shannon entropy (Section 3 in the paper).
>  Shannon can use the term entropy (why not) but then we should just
> distinguish between the informational entropy and the thermodynamic
> entropy as they have been introduced for completely different
> problems.
> The logic that both entropies are the same is in Section 4 and it is
>  expressed bluntly as
> p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is
> equivalent to the Shannon entropy up to the multiplicative constant
> nk and the additive constant C."
> p. 15 (129) "The most straightforward connection is between the Gibbs
>  entropy and the continuous Shannon entropy, which differ only by the
>  multiplicative constant k."
> Personally I find this clumsy. In my view, the same mathematical
> structure of equations does not say that the phenomena are related.
> For example the Poisson equation for electrostatics is mathematically
>  equivalent to the stationary heat conduction equation. So what?
> Well, one creative use is for people who have a thermal FEM solver
> and do not have an electrostatic solver. They can solve an
> electrostatic problem by using a thermal FEM solver by means of
> mathematical analogy. This does happen but I doubt that we could
> state that the stationary heat conduction is equivalent to
> electrostatics.
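
For reference, the analogy mentioned in the quoted text, written out for constant material coefficients (my notation, not the paper's):

  \nabla^2 \varphi = -\rho/\varepsilon   (electrostatic potential),
  \nabla^2 T = -q/k   (stationary heat conduction with volumetric source q).

Both are Poisson equations, so a thermal FEM solver can indeed be fed electrostatic data after relabeling the coefficients; nothing in that relabeling suggests that temperature is electric potential.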

In my opinion, the similarity of mathematical equations does not mean that the phenomena are the same. Basically it is a matter of definitions. If you define information through the Shannon entropy, that is okay. You have, however, to prove that the Shannon entropy is the same as the thermodynamic entropy. In this respect the similarity of the equations is, in my view, a weak argument.
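
To spell out which equations are being compared (standard textbook forms, written here from memory rather than copied from the paper):

  S_G = -k \int \rho(x) \ln \rho(x) \, d\Gamma   (Gibbs entropy),
  H = -\int p(x) \ln p(x) \, dx   (continuous Shannon entropy),

so the two coincide up to the factor k once \rho is read as a probability density. That formal coincidence is exactly what I consider a weak argument.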

It is not based just on the similarity of equations. The equations are similar because they come from the same concepts. The Shannon entropy of the distribution over the phase space available to a system, given the values of some macro variables such as temperature, pressure, ..., is proportional to the statistical-mechanical entropy of the system. There are idealizations in the analysis, both on the thermodynamic and on the statistical mechanics side. Among the idealizations is the neglect of bulk shapes (e.g. the text stamped on a coin) and of collective motions (e.g. acoustic waves).
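
A one-line version of that claim, in the simplest (microcanonical) idealization where all W microstates compatible with the macro variables are taken as equally probable (my paraphrase of the standard textbook argument, not a quotation):

  H = -\sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = \ln W,   and   S = k \ln W = k H,

i.e. the Boltzmann entropy is the Shannon entropy of the uniform distribution over the accessible microstates, scaled by the constant k.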


Do you have anything else to support the claim that thermodynamic entropy is information, other than that the two equations are similar to each other?

