Re: Request: computation=thermodynamics paper(s)

2011-04-16 Thread Evgenii Rudnyi

On 15.04.2011 21:44 meekerdb said the following:

Entropy and information are related. In classical thermodynamics the
relation is between what constraint you impose on the substance and
dQ/T. You note that it is calculated assuming constant pressure -
that is a constraint; another is assuming constant energy. In terms
of the phase space in a statistical mechanics model, this is
confining the system to a hypersurface in the phase space. If you
had more information about the system, e.g. if you knew all the
molecules were moving in the same direction (as in a rocket exhaust),
then you would further reduce the accessible part of phase space and
the entropy. If you knew the proportions of molecular species, that
would reduce it further. In rocket exhaust calculations the
assumption of fixed species proportions is often made as an
approximation - it's referred to as a frozen entropy calculation.
If the species react, that changes the size of the phase space and
hence the Boltzmann measure of entropy.

Brent


First, how do you define information? According to Shannon?

Then, if we consider a thermodynamic system, the Second Law

dS = dQ/T

does not impose constraints as such. It holds for any closed system
and for any process. The only assumption here is that the system
possesses a temperature. If one can define a temperature, then the
entropy follows from the Second Law unambiguously, and I do not see
why one would additionally need information, whatever that means.


If we speak about reaction chemistry, let us consider a simple exercise
from classical thermodynamics.


Problem. Given temperature, pressure, and initial number of moles of
NH3, N2 and H2, compute the equilibrium composition.

To solve the problem, one should find the thermodynamic properties of NH3,
N2 and H2, for example in the JANAF Tables, and then compute the
equilibrium constant.


From thermodynamic tables (all values are molar values at the standard
pressure of 1 bar; I have omitted the standard-state symbol ° for
simplicity, but it is very important not to forget it):


Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2), 
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)


2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is 
not a big deal to extend the equations to include heat capacities as well.


Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure, and the initial number of moles are given, it
is rather straightforward to compute the equilibrium composition. So, the
entropy is there. What do you mean when you state that information is
also involved? Where in this example is the related information, again,
whatever it is?
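
For concreteness, here is a minimal numerical sketch of this exercise in
Python. I use approximate textbook values in place of the JANAF entries,
assume Del_Cp_r = 0 as above, and pick scipy's brentq root finder; any
such solver would do, and the names below are just my own.

import math
from scipy.optimize import brentq

R = 8.314  # gas constant, J/(mol*K)

# Approximate 298 K values standing in for the JANAF entries
# (J/mol and J/(mol*K)); Del_f_H_298 is zero for the elements N2 and H2.
Del_f_H_298 = {"NH3": -45900.0, "N2": 0.0, "H2": 0.0}
S_298 = {"NH3": 192.8, "N2": 191.6, "H2": 130.7}

# Reaction 2 NH3 = N2 + 3 H2
Del_H_r_298 = (Del_f_H_298["N2"] + 3 * Del_f_H_298["H2"]
               - 2 * Del_f_H_298["NH3"])
Del_S_r_298 = S_298["N2"] + 3 * S_298["H2"] - 2 * S_298["NH3"]

def Kp(T):
    Del_G_r_T = Del_H_r_298 - T * Del_S_r_298  # Del_Cp_r = 0 assumed
    return math.exp(-Del_G_r_T / (R * T))      # from Del_G_r_T = -R T ln Kp

def equilibrium(T, P, n_NH3, n_N2, n_H2):
    """Equilibrium moles at temperature T (K) and total pressure P (bar)."""
    K = Kp(T)
    def f(x):  # x = extent of reaction; f(x) = 0 at equilibrium
        n = {"NH3": n_NH3 - 2 * x, "N2": n_N2 + x, "H2": n_H2 + 3 * x}
        n_tot = sum(n.values())
        p = {s: n[s] / n_tot * P for s in n}   # partial pressures, bar
        return p["N2"] * p["H2"] ** 3 / p["NH3"] ** 2 - K
    # x is bounded by exhaustion of N2/H2 (reverse) or of NH3 (forward).
    eps = 1e-12
    x = brentq(f, max(-n_N2, -n_H2 / 3) + eps, n_NH3 / 2 - eps)
    return {"NH3": n_NH3 - 2 * x, "N2": n_N2 + x, "H2": n_H2 + 3 * x}

print(equilibrium(T=298.15, P=1.0, n_NH3=1.0, n_N2=0.5, n_H2=0.5))

The entropy enters only through Del_S_r_298; nothing in the calculation
requires a notion of information.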





Re: Request: computation=thermodynamics paper(s)

2011-04-15 Thread Bruno Marchal

Hi Colin,

Energy cost is due to the erasure of information only (Landauer's
principle), and you can compute without erasing anything, as you need
to do in quantum computation. You might search on Landauer,
Bennett, Zurek, and on Maxwell's demon.
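
As a quick back-of-the-envelope illustration (a sketch in Python, just
the standard textbook formula): erasing one bit at temperature T costs
at least kT ln 2.

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(T):
    """Minimum energy in joules to erase one bit at temperature T (kelvin)."""
    return k_B * T * math.log(2)

print(landauer_limit(300.0))  # ~2.87e-21 J per bit at room temperature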


Bruno


On 15 Apr 2011, at 02:27, Colin Hales wrote:


Hi all,
I was wondering if anyone out there knows of any papers that connect  
computational processes to thermodynamics in some organized fashion.  
The sort of thing I am looking for would have statements saying


cooling is ... (info/computational equivalent)
pressure is ... (info/computational equivalent)
temperature is ...
volume is ...
entropy is ...

I have found a few, but I think I am missing the good stuff.
Here's one ...

Reiss, H. 'Thermodynamic-Like Transformations in Information  
Theory', Journal of Statistical Physics vol. 1, no. 1, 1969. 107-131.


cheers
colin





http://iridia.ulb.ac.be/~marchal/






Re: Request: computation=thermodynamics paper(s)

2011-04-15 Thread Evgenii Rudnyi

Colin,

I used to work in chemical thermodynamics for a while, so I will give you
an answer from that viewpoint. As this is the area I know, my message
will be a bit long, and I guess it differs from the viewpoint of people
in information theory.


CLASSICAL THERMODYNAMICS

Entropy was first defined in classical thermodynamics, and the best is
to start there. Basically:


The Zeroth Law defines the temperature. If two systems are in thermal 
equilibrium with a third system, then they are in thermal equilibrium 
with each other.


The Second Law defines the entropy. There exists an additive state
function S such that dS = dQ/T (the heat Q itself is not a state function).


The Third Law additionally states that at zero K the change in entropy
is zero for all processes, which allows us to define the absolute
entropy unambiguously. Note that for the energy we always have
differences only (with the exception of E = mc^2).


That's it. The rest follows from the above; clearly you also need the
First Law to define the internal energy. This is enough to determine
entropy in practical applications. Just tell me the entropy of what you
want to evaluate, and I will describe how it could be done.
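
To make one such practical determination concrete, here is a minimal
sketch in Python (my illustration): for heating at constant pressure,
dS = dQ/T = n Cp dT / T, so with Cp roughly constant the entropy change
is n Cp ln(T2/T1). The Cp of liquid water below is an approximate
textbook number.

import math

Cp_water = 75.3  # J/(mol*K), approximate molar heat capacity of liquid water

def delta_S(n, Cp, T1, T2):
    """Entropy change in J/K for heating n moles from T1 to T2 (kelvin)."""
    return n * Cp * math.log(T2 / T1)

print(delta_S(1.0, Cp_water, 298.15, 373.15))  # ~16.9 J/K per mole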


A nice book about classical thermodynamics is The Tragicomical History
of Thermodynamics by Truesdell, but please do not take it too
seriously. Everything that he writes is correct, but classical
thermodynamics has somehow survived until now, though I am afraid it is
a bit exotic. Well, if someone needs numerical values of the entropy,
people still obtain them the usual way of classical thermodynamics.


STATISTICAL THERMODYNAMICS

Statistical thermodynamics was developed after classical
thermodynamics, and I guess many believe that it has completely replaced
classical thermodynamics. The Boltzmann equation for the entropy
looks so attractive that most people are acquainted with it only, and I
am afraid that they do not quite know the business with heat engines
that was actually the starting point for the entropy.


Here let me repeat what I have written recently to this list about heat
vs. molecular motion, as this gives you an idea of the difference
between statistical and classical thermodynamics (replace heat by
classical thermodynamics and molecular motion by statistical).


At the beginning, molecules and atoms were considered as hard
spheres. At this stage, there was the following problem. We bring a
glass of hot water into the room and leave it there. Eventually the
temperature of the water will be equal to the ambient temperature.
According to the heat theory, the temperature in the glass will never be
hot again spontaneously, and this is in complete agreement with our
experience. With molecular motion, if we consider molecules as hard
spheres, there is a nonzero chance that the water in the glass will
become hot again. Moreover, there is a theorem (Poincaré recurrence)
that states that if we wait long enough, the temperature of the glass
must become hot again. No doubt, the chances are very small and the time
to wait is very long; in a way, this is negligible. Yet some people are
happy with such a statistical explanation, some are not. Hence, it is a
bit too simple to say that molecular motion has eliminated heat at this
level.


INFORMATION ENTROPY

Shannon defined the information entropy in a way similar to the
Boltzmann equation for the entropy. Since then, many believe that
Shannon's entropy is the same as the thermodynamic entropy. In my view
this is wrong, and here is why:


http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html

I believe that here everything depends on definitions, and if we start
with the entropy as defined by classical thermodynamics, then it has
nothing to do with information.
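
To make the formal similarity and the difference in units explicit (an
illustrative sketch under the standard textbook definitions): Shannon's
H = -sum p_i log2 p_i is a dimensionless number of bits for any
probability distribution whatsoever, while the Gibbs/Boltzmann form
S = -k_B sum p_i ln p_i carries units of J/K and refers to physical
microstates; formally the two differ only by the factor k_B ln 2.

import math

k_B = 1.380649e-23  # J/K

def shannon_bits(p):
    """Shannon entropy in bits of a probability distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_JK(p):
    """Gibbs/Boltzmann form of the same distribution, in J/K."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(shannon_bits(p))                       # 1.5 bits
print(gibbs_JK(p))                           # ~1.44e-23 J/K
print(shannon_bits(p) * k_B * math.log(2))   # equals gibbs_JK(p)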


INFORMATION AND THERMODYNAMIC ENTROPY

That said, in my view there is meaningful research where people
try to estimate the thermodynamic limit on the number of operations.
The idea here is to use kT as a reference. I remember that there was a
nice description of that, with references, in


Nanoelectronics and Information Technology, ed. Rainer Waser

I believe it was somewhere in the introduction, but I am not sure now.
By the way, the book is very good, but I am not sure whether it as such
is what you are looking for.
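
To give an idea of the kind of kT-reference estimate meant here (my own
rough sketch, not a calculation taken from the book): if each logically
irreversible operation must dissipate at least kT ln 2, then a power
budget bounds the rate of operations.

import math

k_B = 1.380649e-23  # J/K

def max_ops_per_second(P_watts, T_kelvin):
    """Upper bound on irreversible bit operations per second for power P."""
    return P_watts / (k_B * T_kelvin * math.log(2))

print(max_ops_per_second(1.0, 300.0))  # ~3.5e20 ops/s per watt at 300 K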


Evgenii



On 15.04.2011 02:27 Colin Hales said the following:

Hi all, I was wondering if anyone out there knows of any papers that
connect computational processes to thermodynamics in some organized
fashion. The sort of thing I am looking for would have statements
saying

cooling is ... (info/computational equivalent)
pressure is ... (info/computational equivalent)
temperature is ...
volume is ...
entropy is ...

I have found a few, but I think I am missing the good stuff. Here's
one ...

Reiss, H. 'Thermodynamic-Like Transformations in Information Theory',
 Journal of Statistical Physics vol. 1, no. 1, 1969. 107-131.

cheers colin




Re: Request: computation=thermodynamics paper(s)

2011-04-15 Thread meekerdb
Entropy and information are related.  In classical thermodynamics the 
relation is between what constraint you impose on the substance and 
dQ/T.  You note that it is calculated assuming constant pressure - that 
is a constraint; another is assuming constant energy.  In terms of the 
phase space in a statistical mechanics model, this is confining the 
system to a hypersurface in the phase space.  If you had more 
information about the system, e.g. if you knew all the molecules were 
moving in the same direction (as in a rocket exhaust), then you would 
further reduce the accessible part of phase space and the entropy.  If 
you knew the proportions of molecular species, that would reduce it 
further.  In rocket exhaust calculations the assumption of fixed 
species proportions is often made as an approximation - it's referred 
to as a frozen entropy calculation.   If the species react, that 
changes the size of the phase space and hence the Boltzmann measure of 
entropy.
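
A toy quantification of this point (a sketch; the numbers are
arbitrary): each constraint that confines the system to a smaller
region of phase space lowers the Boltzmann entropy S = k ln W by the
corresponding amount.

import math

k_B = 1.380649e-23  # J/K

def boltzmann_S(W):
    """Boltzmann entropy in J/K for W equally likely microstates."""
    return k_B * math.log(W)

# N molecules, each with d equally likely velocity directions: W = d**N.
# Learning that all molecules move in the same direction leaves W = d.
N, d = 100, 6
S_unconstrained = boltzmann_S(d ** N)
S_constrained = boltzmann_S(d)
print(S_unconstrained - S_constrained)  # (N-1) * k_B * ln d, ~2.4e-21 J/K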


Brent

On 4/15/2011 12:09 PM, Evgenii Rudnyi wrote:

[Evgenii Rudnyi's message of 2011-04-15, quoted in full, snipped here;
see his post above.]

Re: Request: computation=thermodynamics paper(s)

2011-04-14 Thread Russell Standish
Szilard did a whole bunch of stuff in this area in the 1940s. Feynman
has some good introductions to it in his Lectures on Physics series (I
forget which volume), IIRC. This was more focussed on the
thermodynamics of computation (e.g. what efficiency limits there are on
processing bits).

Later on, there was some work basing statistical mechanics on
information theory. Denbigh and Denbigh was a good book from the early
'80s that talked about this. This stuff is kind of the reverse side of
the coin to Szilard's stuff.

Cheers

On Fri, Apr 15, 2011 at 10:27:45AM +1000, Colin Hales wrote:
 Hi all,
 I was wondering if anyone out there knows of any papers that connect
 computational processes to thermodynamics in some organized fashion.
 The sort of thing I am looking for would have statements saying
 
 cooling is ... (info/computational equivalent)
 pressure is ... (info/computational equivalent)
 temperature is ...
 volume is ...
 entropy is ...
 
 I have found a few, but I think I am missing the good stuff.
 Here's one ...
 
 Reiss, H. 'Thermodynamic-Like Transformations in Information
 Theory', Journal of Statistical Physics vol. 1, no. 1, 1969.
 107-131.
 
 cheers
 colin
 

-- 


Prof Russell Standish                  Phone 0425 253119 (mobile)
Mathematics
UNSW SYDNEY 2052                       hpco...@hpcoders.com.au
Australia                              http://www.hpcoders.com.au

