I think entropy is better understood intuitively as uncertainty. The entropy of a gas is the uncertainty about the particle positions and velocities. The hotter the gas, the more uncertainty there is. A certain amount of information is required to eliminate this uncertainty.
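
As a toy illustration (my own sketch, not tied to gases specifically): Shannon's formula counts the average number of bits needed to remove the uncertainty about which state is the actual one.

import math

def shannon_entropy(probs):
    # Average number of bits needed to resolve the uncertainty
    # in a discrete distribution: H = -sum_i p_i * log2(p_i)
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit of uncertainty
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits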

Jason

On Feb 5, 2012, at 12:28 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

On 05.02.2012 17:16 Evgenii Rudnyi said the following:
On 24.01.2012 22:56 meekerdb said the following:

In thinking about how to answer this I came across an excellent
paper by Roman Frigg and Charlotte Werndl
http://www.romanfrigg.org/writings/EntropyGuide.pdf which
explicates the relation more comprehensively than I could and which
also gives some historical background and extensions: specifically
look at section 4.

Brent

I have browsed the paper, and I must say that I am not impressed. The logic is exactly the same as in other papers and books.

I have nothing against the Shannon entropy (Section 3 in the paper). Shannon was free to use the term entropy (why not?), but then we should simply distinguish between the informational entropy and the thermodynamic entropy, as they were introduced for completely different problems.

The argument that the two entropies are the same is in Section 4, and it is stated bluntly:

p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is equivalent to the Shannon entropy up to the multiplicative constant nk and the additive constant C."

p. 15 (129) "The most straightforward connection is between the Gibbs entropy and the continuous Shannon entropy, which differ only by the multiplicative constant k."
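
Spelled out (my own schematic notation, assuming the standard textbook definitions rather than the paper's exact conventions), the claimed relations are

  S_{B,m}(M_i) = nk \, H(p) + C,  where  H(p) = -\sum_i p_i \ln p_i ,

and

  S_G(\rho) = -k \int \rho \ln \rho \, d\Gamma = k \, H_{cont}(\rho) .

So yes, the formulas coincide up to the constants nk, k and C.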

Personally I find this clumsy. In my view, the same mathematical structure of the equations does not imply that the phenomena are related. For example, the Poisson equation for electrostatics is mathematically equivalent to the stationary heat conduction equation. So what? Well, one creative use is for people who have a thermal FEM solver but no electrostatic solver: they can solve an electrostatic problem with the thermal solver by means of the mathematical analogy, as shown below. This does happen, but I doubt we could therefore state that stationary heat conduction is equivalent to electrostatics.
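
To make the analogy explicit (standard textbook forms, my notation): both problems are governed by the same elliptic equation,

  \nabla \cdot (\varepsilon \nabla \phi) = -\rho        (electrostatics)
  \nabla \cdot (k \nabla T) = -\dot{q}                  (stationary heat conduction)

so the dictionary \phi \leftrightarrow T, \varepsilon \leftrightarrow k, \rho \leftrightarrow \dot{q} lets a thermal solver do electrostatic work. The equations map onto each other; the phenomena remain different.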

It looks funniest in the conclusion:

p. 28 (142) "First, all notions of entropy discussed in this essay, except the thermodynamic and the topological entropy, can be understood as variants of some information-theoretic notion of entropy."

I understand it this way: when I work with a gas, liquid, or solid at the level of experimental thermodynamics, the information, according to the authors, is not there (on this point I agree with them). Yet as soon as theoretical physicists start thinking about the same objects, the objects turn out to be filled with information.

Evgenii

Brent,

I have started reading the pdf. A few comments on Section 2, Entropy in thermodynamics.

The authors seem to be sloppy.

1) p. 2 (116). "If we consider a cyclical process—a process in which the beginning and the end state are the same—a reversible process leaves the system and its surroundings unchanged."

This is wrong: if one runs the Carnot cycle reversibly, then heat will be converted to work (or vice versa) and there will be changes in the surroundings. They probably mean that if one runs the Carnot cycle reversibly twice, first in one direction and then in the opposite one, then the surroundings will be unchanged.
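
Concretely (standard bookkeeping, not from the paper): in one reversible cycle the working body takes heat Q_h from the hot reservoir at T_h, rejects Q_c at T_c with Q_h / T_h = Q_c / T_c, and delivers the work

  W = Q_h - Q_c > 0 .

The entropy of the working body returns to its initial value, but the reservoirs and the work receiver are certainly changed; only running the same cycle once more in the reverse direction restores them.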

2) p. 2 (116). "We can then assign an absolute entropy value to every
state of the system by choosing one particular state A (we can
choose any state we please!) as the reference point."

They misuse the conventional terminology. The absolute entropy is defined by the Third Law, whereas they just want to employ S instead of ΔS. This is rather dangerous: when one changes the working body in the Carnot cycle, such a notation will lead to a catastrophe.
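
For reference (the standard calorimetric definition, for a pure crystalline substance and ignoring residual entropy): the Third Law fixes S(0) = 0, so that

  S(T) = \int_0^T \frac{C_p(T')}{T'} \, dT' + \sum_{trans} \frac{\Delta H_{trans}}{T_{trans}} ,

with no freedom to pick the reference state; that is what "absolute entropy" conventionally means.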

3) p. 3 (117). "If we now restrict attention to adiathermal processes
(i.e. ones in which temperature is constant),"

According to Eq. 4, which they discuss, they mean an adiabatic process, in which the temperature is not constant.
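
Indeed, for a reversible adiabatic process \delta Q = 0 along the whole path, so

  \Delta S = \int \frac{\delta Q_{rev}}{T} = 0 ,

while the temperature generally changes; for an ideal gas, for instance, T V^{\gamma - 1} = const during a reversible adiabatic expansion.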

However, at the end of this small section they write

p. 3 (117). "S_TD has no intuitive interpretation as a measure of
disorder, disorganization, or randomness (as is often claimed). In
fact such considerations have no place in TD."

I completely agree with that, so I am going to read further.

Evgenii

