I think entropy is better understood intuitively as uncertainty. The entropy of a gas is the uncertainty in the particle positions and velocities. The hotter the gas, the more uncertainty there is. A certain amount of information is required to eliminate this uncertainty.

Jason
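Jason's reading of entropy as uncertainty can be made concrete with the Shannon entropy, which measures the average number of bits needed to remove the uncertainty about an outcome. A minimal sketch (the function name and the example distributions are illustrative, not from the thread):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average number of yes/no
    questions needed to identify the actual outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply peaked distribution: the outcome is nearly certain,
# so little information is needed to eliminate the uncertainty.
peaked = [0.97, 0.01, 0.01, 0.01]

# A uniform distribution over 4 outcomes: maximal uncertainty,
# requiring log2(4) = 2 bits to pin down the outcome.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked) < shannon_entropy(uniform))  # True
print(shannon_entropy(uniform))                            # 2.0
```

A broader (e.g. hotter) distribution over microstates has higher entropy in exactly this sense: more bits are required to specify which microstate is realized.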


On Feb 5, 2012, at 12:28 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

On 05.02.2012 17:16 Evgenii Rudnyi said the following:

On 24.01.2012 22:56 meekerdb said the following:

In thinking about how to answer this I came across an excellent paper by Roman Frigg and Charlotte Werndl http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates the relation more comprehensively than I could and which also gives some historical background and extensions: specifically look at section 4.

Brent

I have browsed the paper. I should say that I am not impressed. The logic is exactly the same as in other papers and books.

I have nothing against the Shannon entropy (Section 3 in the paper). Shannon can use the term entropy (why not), but then we should simply distinguish between the informational entropy and the thermodynamic entropy, as they were introduced for completely different problems.

The claim that both entropies are the same is made in Section 4, and it is expressed bluntly:

p. 13 (127): "Then, if we regard the w_i as messages, S_B,m(M_i) is equivalent to the Shannon entropy up to the multiplicative constant nk and the additive constant C."

p. 15 (129): "The most straightforward connection is between the Gibbs entropy and the continuous Shannon entropy, which differ only by the multiplicative constant k."

Personally I find this clumsy. In my view, the same mathematical structure of the equations does not mean that the phenomena are related. For example, the Poisson equation for electrostatics is mathematically equivalent to the stationary heat conduction equation. So what? Well, one creative use is for people who have a thermal FEM solver but no electrostatic solver: they can solve an electrostatic problem with the thermal solver by means of the mathematical analogy. This does happen, but I doubt we could say that stationary heat conduction is equivalent to electrostatics.

It looks funniest in the conclusion:

p. 28 (142): "First, all notions of entropy discussed in this essay, except the thermodynamic and the topological entropy, can be understood as variants of some information-theoretic notion of entropy."

I understand it this way: when I work with a gas, liquid or solid at the level of experimental thermodynamics, the information, according to the authors, is not there (at this point I agree with them). Yet as soon as theoretical physicists start thinking about these objects, the objects turn out to be fully filled with information.

Evgenii

Brent, I have started reading the pdf. A few comments on section 2, Entropy in thermodynamics. The authors seem to be sloppy.

1) p. 2 (116): "If we consider a cyclical process—a process in which the beginning and the end state are the same—a reversible process leaves the system and its surroundings unchanged." This is wrong: if one runs the Carnot cycle reversibly, heat will be converted to work (or vice versa) and there will be changes in the surroundings. They probably mean that if one runs the Carnot cycle reversibly twice, first in one direction and then in the opposite one, then the surroundings will be unchanged.

2) p. 2 (116): "We can then assign an absolute entropy value to every state of the system by choosing one particular state A (we can choose any state we please!) as the reference point." They misuse the conventional terminology. The absolute entropy is defined by the Third Law; they just want to employ S instead of ΔS. This is pretty dangerous: when one changes the working body in the Carnot cycle, such a notation will lead to a catastrophe.

3) p. 3 (117): "If we now restrict attention to adiathermal processes (i.e. ones in which temperature is constant)," According to Eq. 4 that they discuss, they actually mean an adiabatic process, in which temperature is not constant. However, at the end of this small section they write, p. 3 (117): "S_TD has no intuitive interpretation as a measure of disorder, disorganization, or randomness (as is often claimed). In fact such considerations have no place in TD." I completely agree with that, so I am going to read further.

Evgenii
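For reference, the two relations discussed above can be written out explicitly; this is a sketch from the standard textbook definitions, not taken from the paper itself. The continuous Shannon entropy of a phase-space density ρ and the Gibbs entropy differ only by the constant k, as the quoted passage states:

```latex
H[\rho] = -\int \rho(x)\,\ln \rho(x)\,\mathrm{d}x,
\qquad
S_G[\rho] = -k \int \rho(x)\,\ln \rho(x)\,\mathrm{d}x = k\,H[\rho].
```

The formal analogy invoked in the objection is of the same kind: Poisson's equation for the electrostatic potential and stationary heat conduction with constant conductivity have an identical mathematical form,

```latex
\nabla^2 \varphi = -\frac{\rho_q}{\varepsilon_0},
\qquad
\nabla^2 T = -\frac{q}{\kappa},
```

yet describe physically different phenomena; the argument in the thread is about whether identity of form licenses identity of meaning.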

-- You received this message because you are subscribed to the Google Groups "Everything List" group. To post to this group, send email to everything-list@googlegroups.com. To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.