On 05.02.2012 22:33 Russell Standish said the following:
On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:
The funniest part is the conclusion:

p. 28(142) "First, all notions of entropy discussed in this essay,
except the thermodynamic and the topological entropy, can be
understood as variants of some information-theoretic notion of
entropy."

I understand it this way. When I am working with a gas, liquid or
solid at the level of experimental thermodynamics, the information,
according to the authors, is not there (at this point I am in
agreement with them). Yet as soon as theoretical physicists start
thinking about these objects, the objects turn out to be full of
information.

Evgenii

Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will
certainly be arguing against orthodoxy, but you're welcome to try.

There is some difference between the entropy in classical thermodynamics and the entropy in statistical thermodynamics. I will copy my old text to describe it.

In order to explain this, let us consider a simple experiment. We bring a glass of hot water into a room and leave it there. Eventually the temperature of the water becomes equal to the ambient temperature. In classical thermodynamics this process is considered irreversible, that is, the Second Law forbids the water in the glass from spontaneously becoming hot again. This is in complete agreement with our experience, so one would expect the same from statistical mechanics. There, however, the entropy has a statistical meaning, and there is a nonzero chance that the water becomes hot again. Moreover, there is a theorem (the Poincaré recurrence theorem) which states that if we wait long enough, the water in the glass must become hot again.
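
To see why this difference never shows up in practice, here is a rough back-of-the-envelope estimate (my own, using the standard Boltzmann fluctuation formula, not part of the original argument): the probability of a spontaneous fluctuation that decreases the total entropy by \Delta S is of order

  P \sim \exp(-\Delta S / k)

where k is the Boltzmann constant. For a glass of water that has cooled to room temperature, the entropy produced is of order 10 J/K, so \Delta S / k is of order 10^{24} and P is unimaginably small; the corresponding Poincaré recurrence time exceeds any physically meaningful timescale.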

Otherwise, they are the same. This does not mean, however, that information comes into play in the Boltzmann-Gibbs formulation. You seem to have missed my comment on this, hence I will repeat it.

On 05.02.2012 19:28 Evgenii Rudnyi said the following:
...
> I have browsed the paper. I should say that I am not impressed. The
> logic is exactly the same as in other papers and books.
>
> I have nothing against the Shannon entropy (Section 3 in the paper).
>  Shannon can use the term entropy (why not) but then we should just
> distinguish between the informational entropy and the thermodynamic
> entropy as they have been introduced for completely different
> problems.
>
> The logic that both entropies are the same is in Section 4 and it is
>  expressed bluntly as
>
> p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is
> equivalent to the Shannon entropy up to the multiplicative constant
> nk and the additive constant C."
>
> p. 15 (129) "The most straightforward connection is between the Gibbs
>  entropy and the continuous Shannon entropy, which differ only by the
>  multiplicative constant k."
>
> Personally I find this clumsy. In my view, the same mathematical
> structure of the equations does not imply that the phenomena are
> related. For example, the Poisson equation for electrostatics is
> mathematically equivalent to the stationary heat conduction equation
> (both are written out below). So what? Well, one creative use is for
> people who have a thermal FEM solver and do not have an electrostatic
> solver. They can solve an electrostatic problem with a thermal FEM
> solver by means of the mathematical analogy. This does happen, but I
> doubt that we could state that stationary heat conduction is
> equivalent to electrostatics.
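
For concreteness (my addition; the equations are standard and are not quoted from the paper), the analogy above is between

  \nabla \cdot (\varepsilon \nabla \phi) = -\rho    (electrostatics: potential \phi, permittivity \varepsilon, charge density \rho)
  \nabla \cdot (\lambda \nabla T) = -q              (stationary heat conduction: temperature T, conductivity \lambda, heat source q)

The two equations have identical mathematical form, which is exactly why a thermal FEM solver can be reused for an electrostatic problem, yet nobody concludes from this that heat conduction is electrostatics.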

In my opinion, the similarity of mathematical equations does not mean that the phenomena are the same. Basically it is a question of definitions. If you define information through the Shannon entropy, that is fine. You still have, however, to prove that the Shannon entropy is the same as the thermodynamic entropy. In this respect the similarity of the equations is, in my view, a weak argument.
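
For reference (standard textbook forms, written out here rather than quoted from the paper), the two equations whose similarity is at issue are

  S_G = -k \sum_i p_i \ln p_i        (Gibbs entropy, k the Boltzmann constant)
  H   = -\sum_i p_i \log_2 p_i       (Shannon entropy, in bits)

They indeed differ only by a constant factor (k if the Shannon entropy is taken in natural-log units, k \ln 2 if in bits), which is the formal identity the paper appeals to.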

Do you have anything else to support the claim that the thermodynamic entropy is information, apart from the fact that the two equations are similar to each other?

Evgenii


If you agree that it is the same, then surely you can see that
information and entropy are related - they are both the logarithm of
a probability - in the case of Boltzmann it is the logarithm of the
number of possible microstates multiplied by the probability of the
thermodynamic state.
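
To make the "logarithm of a probability" point explicit (standard forms, added here for clarity):

  S_B = k \ln W
  I(x) = -\log_2 p(x)

where W is the number of microstates compatible with the macrostate and I(x) is the Shannon information of an event of probability p(x). If the W microstates are equally probable, specifying one of them requires \log_2 W bits, so S_B = (k \ln 2) \log_2 W, i.e. the Boltzmann entropy is, up to the constant k \ln 2, the information needed to single out a microstate given the macrostate.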

Are you aware of the result relating the Kolmogorov "program length"
complexity measure to the logarithm of the probability of that
program appearing in the universal prior?
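
Presumably the result meant here is the coding theorem of algorithmic information theory (stated from memory, not quoted from the post):

  K(x) = -\log_2 m(x) + O(1),   with   m(x) = \sum_{p : U(p) = x} 2^{-|p|}

i.e. the prefix Kolmogorov complexity of a string x equals, up to an additive constant, the negative logarithm of its probability under the universal prior m (the Solomonoff-Levin semimeasure), where U is a universal prefix machine and |p| is the length of program p.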

Both are examples of information, but measured in different
contexts.

I will comment on the entropy context of the JANAF tables in another
post. Essentially you are asserting that the context of those tables
is the only context under which thermodynamic entropy makes sense.
All other contexts for which there is an entropy-like quantity do
not count, and those measures should be called something else. A
variety of information, or complexity perhaps.

Alternatively, we could recognise the modern understanding that
these terms are all essentially equivalent, but that they refer to a
family of measures that vary depending on the context.

It comes down to a terminological argument, sure, but your
insistence that thermodynamic entropy is a special case strikes me as
a baroque means of hiding the thermodynamic context - one that
doesn't engender understanding of the topic.

