I used to work in chemical thermodynamics for a while, and I will give you an answer from that viewpoint. As this is the area that I know, my message will be a bit long, and I guess it differs from the viewpoint of people in information theory.
Entropy was first defined in classical thermodynamics, and it is best to start there. Basically:
The Zeroth Law defines temperature: "If two systems are in thermal equilibrium with a third system, then they are in thermal equilibrium with each other."
The Second Law defines entropy: "There exists an additive state function S such that dS >= dQ/T." (The heat Q is not a state function.)
The Third Law additionally states that at 0 K the change in entropy is zero for all processes, which allows us to define the absolute entropy unambiguously. Note that for the energy we always have only differences (with the exception of E = mc^2).
That's it. The rest follows from the above; clearly you also need the First Law to define the internal energy. I mean that this is enough to determine entropy in practical applications. Just tell me the entropy of what you want to evaluate and I will describe how it could be done.
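For illustration, here is a minimal sketch (in Python, my own example, not part of the original question) of the usual way: the entropy change of a body heated reversibly at roughly constant heat capacity follows directly from dS = dQ/T.

import math

def entropy_change(mass_kg, c_p, T1, T2):
    """Entropy change in J/K for reversible heating from T1 to T2 (in K) at
    constant pressure, assuming a constant specific heat c_p in J/(kg K):
    dS = dQ/T with dQ = m*c_p*dT, hence Delta S = m*c_p*ln(T2/T1)."""
    return mass_kg * c_p * math.log(T2 / T1)

# Example: 1 kg of water (c_p about 4184 J/(kg K)) heated from 300 K to 350 K
print(entropy_change(1.0, 4184.0, 300.0, 350.0))  # about 645 J/K
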
A nice book about classical thermodynamics is The Tragicomedy of Classical Thermodynamics by Truesdell, but please do not take it too seriously. Everything that he writes is correct, but somehow classical thermodynamics has survived until now, though I am afraid it is a bit exotic. Well, if someone needs numerical values of the entropy, people obtain them in the usual way of classical thermodynamics.
Statistical thermodynamics was developed after classical thermodynamics, and I guess many believe that it has completely replaced classical thermodynamics. The Boltzmann equation for the entropy looks so attractive that most people are acquainted only with it, and I am afraid that they do not quite know the business with heat engines that was actually the original point of the entropy.
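The Boltzmann equation meant here is S = k ln W, with W the number of microstates. As a toy illustration (my own sketch, not from the original message), one can count the microstates of N two-state particles and get a number for S:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n_up):
    """S = k ln W for N independent two-state particles with n_up of them 'up';
    the number of microstates is the binomial coefficient W = C(N, n_up)."""
    W = math.comb(N, n_up)
    return k_B * math.log(W)

print(boltzmann_entropy(100, 50))  # about 9.2e-22 J/K for the most probable macrostate
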
Here let me repeat what I have written recently to this list about heat vs. molecular motion, as it gives you an idea about the difference between statistical and classical thermodynamics (replace heat by classical thermodynamics and molecular motion by statistical).
At the beginning, molecules and atoms were considered as hard spheres. At this stage, there was the following problem. We bring a glass of hot water into the room and leave it there. Eventually the temperature of the water will be equal to the ambient temperature. According to the heat theory, the water in the glass will never become hot again spontaneously, and this is in complete agreement with our experience.
With molecular motion, if we consider the molecules as hard spheres, there is a nonzero chance that the water in the glass will become hot again. Moreover, there is a theorem (Poincaré recurrence) which states that if we wait long enough, the temperature of the glass must become hot again. No doubt, the chances are very small and the time to wait is very long; in a way this is negligible. Yet some people are happy with such a statistical explanation, some are not. Hence, it is a bit too simple to say that molecular motion has eliminated heat at this level.
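To make the recurrence point concrete, here is a toy sketch (the Ehrenfest urn model, my own choice of illustration, not the hard-sphere gas itself): N particles hop at random between the "glass" and the "room". The all-in-the-glass state does recur, but the typical recurrence time grows roughly like 2^N steps, which is why it is negligible for ~10^23 molecules.

import random

def recurrence_time(n_particles, max_steps=10_000_000):
    """Steps until all particles are back in the 'glass', starting from there."""
    in_glass = n_particles                 # start with every particle in the glass
    for step in range(1, max_steps + 1):
        # pick one particle at random and move it to the other box
        if random.random() < in_glass / n_particles:
            in_glass -= 1                  # a glass particle jumps to the room
        else:
            in_glass += 1                  # a room particle jumps back to the glass
        if in_glass == n_particles:
            return step
    return None                            # no recurrence within max_steps

for n in (4, 8, 12, 16, 20):
    print(n, recurrence_time(n))           # recurrence times grow roughly like 2^n
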
Shannon defined the information entropy in a way similar to the Boltzmann equation for the entropy. Since then, many believe that Shannon's entropy is the same as the thermodynamic entropy. In my view this is wrong, and here is why.
I believe that here everything depends on definitions, and if we start with the entropy as defined by classical thermodynamics, then it has nothing to do with information.
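For comparison, Shannon's entropy of a discrete distribution is H = -sum p_i log2 p_i, which has the same mathematical shape as the Gibbs form S = -k sum p_i ln p_i; the point above is that this formal similarity alone does not make the two the same physical quantity. A minimal sketch:

import math

def shannon_entropy(probs):
    """Information entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))          # 1.0 bit for a fair coin
print(shannon_entropy([0.9, 0.05, 0.05]))   # about 0.57 bits for a skewed distribution
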
INFORMATION AND THERMODYNAMIC ENTROPY
As said above, in my view there is meaningful research where people try to estimate the thermodynamic limit for the number of operations. The idea here is to use kT as a reference. I remember that there was a nice description of that, with references, in
Nanoelectronics and Information Technology, ed. Rainer Waser
I believe it was somewhere in the introduction, but I am not sure now. By the way, the book is very good, but I am not sure whether it as such is what you are looking for.
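As a concrete example of the kind of kT-based estimate meant above, here is a minimal sketch using the Landauer bound of kT ln 2 of heat per erased bit (taking the Landauer bound as the specific limit is my assumption; the text above only says that kT is used as the reference):

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(T_kelvin):
    """Minimum heat in joules dissipated per bit erased at temperature T."""
    return k_B * T_kelvin * math.log(2)

E_bit = landauer_limit(300.0)
print(E_bit)          # about 2.9e-21 J per bit at room temperature
print(1.0 / E_bit)    # about 3.5e20 bit erasures per joule, as an upper bound
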
On 15.04.2011 02:27 Colin Hales said the following:
Hi all, I was wondering if anyone out there knows of any papers that
connect computational processes to thermodynamics in some organized
fashion. The sort of thing I am looking for would have statements:
cooling is .... (info/computational equivalent)
pressure is .... (info/computational equivalent)
temperature is ....
volume is ....
entropy is ....
I have found a few but I think I am missing the good stuff. Here is one:
Reiss, H., 'Thermodynamic-Like Transformations in Information Theory', Journal of Statistical Physics, vol. 1, no. 1, 1969, pp. 107-131.