Entropy and information are related. In classical thermodynamics the relation is between what constraint you impose on the substance and dQ/T. You note that it is calculated assuming constant pressure - that is a constraint; another is assuming constant energy. In terms of the phase space in a statistical mechanics model, this is confining the system to a hypersurface in the phase space. If you had more information about the system, e.g. you knew all the molecules were moving in the same direction (as in a rocket exhaust), then you would further reduce the accessible part of phase space and the entropy. If you knew the proportions of molecular species, that would reduce it further. In rocket exhaust calculations the assumption of fixed species proportions is often made as an approximation - it's referred to as a frozen entropy calculation. If the species react, that changes the size of the phase space and hence the Boltzmann measure of entropy.
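[The phase-space-reduction argument above can be put in numbers with Boltzmann's S = k_B ln W. A minimal sketch with toy counts - N, M and the "one allowed direction" constraint are illustrative assumptions, not a real gas model:]

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Toy model: N molecules, each of which may occupy any of M momentum
# cells compatible with the macrostate, so W = M**N microstates.
N, M = 1000, 6

# S = k_B ln W, computed in log form to avoid forming the huge integer M**N.
S_unconstrained = K_B * N * math.log(M)

# Extra information ("all molecules move in the same direction") confines
# each molecule to 1 of its M cells, so W = 1**N = 1 and S = 0 in this toy.
S_constrained = K_B * N * math.log(1)

# Entropy removed by imposing the constraint (i.e., by the information):
delta_S = S_unconstrained - S_constrained
```

The point of the sketch is only that each added constraint shrinks W, and S shrinks with it by k_B ln(W_before / W_after).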

Brent


On 4/15/2011 12:09 PM, Evgenii Rudnyi wrote:

Colin,

I used to work in chemical thermodynamics for a while, so I will give you the answer from that viewpoint. As this is the area that I know, my message will be a bit long, and I guess it differs from the viewpoint of people in information theory.

CLASSICAL THERMODYNAMICS

Entropy was first defined in classical thermodynamics, and the best is to start with it. Basically:

The Zeroth Law defines the temperature: "If two systems are in thermal equilibrium with a third system, then they are in thermal equilibrium with each other."

The Second Law defines the entropy: "There exists an additive state function such that dS >= dQ/T." (The heat Q is not a state function.)

The Third Law additionally states that at zero K the change in entropy is zero for all processes, which allows us to define the absolute entropy unambiguously. Note that for the energy we always have differences only (with the exception of E = mc^2).

That's it. The rest follows from the above; clearly you also need the First Law to define the internal energy. This is enough to determine entropy in practical applications. Just tell me the entropy of what you want to evaluate, and I will describe how it could be done.

A nice book about classical thermodynamics is The Tragicomedy of Classical Thermodynamics by Truesdell, but please do not take it too seriously. Everything that he writes is correct, but somehow classical thermodynamics has survived until now, though I am afraid it is a bit exotic. If someone needs numerical values of the entropy, people obtain them the usual way of classical thermodynamics.

STATISTICAL THERMODYNAMICS

Statistical thermodynamics was developed after classical thermodynamics, and I guess many believe that it has completely replaced classical thermodynamics.
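[The classical recipe above - integrate dQ/T along a reversible path - can be illustrated with the textbook case of heating water at constant pressure, where dQ = m c_p dT gives dS = m c_p ln(T2/T1). The numbers below are standard handbook values, not from the email:]

```python
import math

def entropy_change_heating(mass_kg: float, c_p: float, t1: float, t2: float) -> float:
    """Delta S = integral of dQ/T = m * c_p * ln(T2/T1) for reversible
    heating at constant pressure, treating c_p as constant over [t1, t2]."""
    return mass_kg * c_p * math.log(t2 / t1)

# Heating 1 kg of water from 20 C (293.15 K) to 80 C (353.15 K),
# with c_p ~ 4184 J/(kg K) assumed constant:
dS = entropy_change_heating(1.0, 4184.0, 293.15, 353.15)  # ~779 J/K
```

This is the sense in which classical thermodynamics "determines entropy in practical applications": no microstates, just heat and temperature along a path.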
The Boltzmann equation for the entropy looks so attractive that most people are acquainted with it only, and I am afraid they do not quite know the business with heat engines that was actually the original point of the entropy.

Here let me repeat what I wrote recently to this list about heat vs. molecular motion, as it gives you an idea of the difference between statistical and classical thermodynamics (replace "heat" by classical thermodynamics and "molecular motion" by statistical).

At the beginning, molecules and atoms were considered as hard spheres. At that stage, there was the following problem. We bring a glass of hot water into a room and leave it there. Eventually the temperature of the water will equal the ambient temperature. According to the heat theory, the temperature in the glass will never become hot again spontaneously, and this is in complete agreement with our experience. With molecular motion, if we consider the molecules as hard spheres, there is a nonzero chance that the water in the glass will become hot again. Moreover, there is a theorem (Poincaré recurrence) that states that if we wait long enough, the temperature of the glass must become hot again. No doubt the chances are very small and the time to wait is very long, so in a way this is negligible. Yet some people are happy with such a statistical explanation, some not. Hence it is a bit too simple to say that molecular motion has eliminated heat at this level.

INFORMATION ENTROPY

Shannon defined the information entropy in a similar way to the Boltzmann equation for the entropy. Since then many believe that Shannon's entropy is the same as the thermodynamic entropy.
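[For reference, Shannon's definition just mentioned is H = -sum p_i log2 p_i. A minimal computation of it:]

```python
import math

def shannon_entropy(probs) -> float:
    """H = -sum p_i log2 p_i, in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit; a biased coin carries less:
h_fair = shannon_entropy([0.5, 0.5])    # 1.0 bit
h_biased = shannon_entropy([0.9, 0.1])  # ~0.469 bits
```

The formal resemblance to the Gibbs form S = -k_B sum p_i ln p_i (they differ by the constant factor k_B ln 2) is exactly what fuels the identification disputed below.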
In my view this is wrong, and here is why:

http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html

I believe that here everything depends on definitions: if we start with the entropy as defined by classical thermodynamics, then it has nothing to do with information.

INFORMATION AND THERMODYNAMIC ENTROPY

That said, in my view there is meaningful research where people try to estimate the thermodynamic limit on the number of operations. The idea is to use kT as a reference. I remember there was a nice description of that, with references, in

Nanoelectronics and Information Technology, ed. Rainer Waser

I believe somewhere in the introduction, but I am not sure now. By the way, the book is very good, but I am not sure if it as such is what you are looking for.

Evgenii

On 15.04.2011 02:27 Colin Hales said the following:

Hi all,

I was wondering if anyone out there knows of any papers that connect computational processes to thermodynamics in some organized fashion. The sort of thing I am looking for would have statements saying:

cooling is .... (info/computational equivalent)
pressure is .... (info/computational equivalent)
temperature is ....
volume is ....
entropy is ....

I have found a few, but I think I am missing the good stuff. Here's one:

Reiss, H. 'Thermodynamic-Like Transformations in Information Theory', Journal of Statistical Physics, vol. 1, no. 1, 1969, pp. 107-131.

cheers
colin
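[On the "kT as a reference" remark in the thread above: the usual concrete form is Landauer's bound, k_B T ln 2 of heat per bit erased. A quick computation with standard constants (the bound itself is not derived in the thread):]

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temperature_k: float) -> float:
    """Minimum heat dissipated to erase one bit: k_B * T * ln 2."""
    return K_B * temperature_k * math.log(2)

e_bit = landauer_limit_joules(300.0)  # ~2.87e-21 J at room temperature
```

This is the kind of estimate the Waser book's introduction uses to bound the number of irreversible operations per joule.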

-- You received this message because you are subscribed to the Google Groups "Everything List" group. To post to this group, send email to everything-list@googlegroups.com. To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.