Re: Entropy: A Guide for the Perplexed

2012-02-07 Thread Evgenii Rudnyi

On 06.02.2012 21:10 meekerdb said the following:

On 2/6/2012 11:18 AM, Evgenii Rudnyi wrote:

On 05.02.2012 22:33 Russell Standish said the following:

On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:

The funniest part is in the conclusion

p. 28(142) "First, all notions of entropy discussed in this
essay, except the thermodynamic and the topological entropy,
can be understood as variants of some information-theoretic
notion of entropy."

I understand it this way. When I am working with gas, liquid
or solid at the level of experimental thermodynamics, the
information according to the authors is not there (at this
point I am in agreement with them). Yet, as soon as theoretical
physicists start thinking about these objects, they happen to
be fully filled with information.

Evgenii


Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will
certainly be arguing against orthodoxy, but you're welcome to
try.


There is some difference between the entropy in classical
thermodynamics and in statistical thermodynamics. I will copy my old
text to describe it.

To explain this, let us consider a simple experiment. We bring a glass
of hot water into the room and leave it there. Eventually the
temperature of the water becomes equal to the ambient temperature. In
classical thermodynamics this process is considered irreversible, that
is, the Second Law forbids the water in the glass from becoming hot
again spontaneously. This is in complete agreement with our experience,
so one would expect the same from statistical mechanics. There,
however, the entropy has a statistical meaning and there is a nonzero
chance that the water will become hot again. Moreover, there is a
theorem (Poincaré recurrence) which states that if we wait long enough,
the water in the glass must become hot again.
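
To get a feeling for how small this nonzero chance is, here is a rough
back-of-the-envelope sketch (an illustration only; the mass, heat
capacity and temperatures are assumed round values). It estimates the
total entropy change if a glass of water at room temperature were to
spontaneously reheat itself at the expense of the room, and the factor
exp(Del_S/k) that sets the relative probability of such a fluctuation.

from math import log

k_B   = 1.380649e-23   # J/K, Boltzmann constant
m     = 0.2            # kg of water (assumed)
c     = 4184.0         # J/(kg K), specific heat of water
T_env = 293.0          # K, room temperature (assumed)
T_hot = 323.0          # K, temperature the water would return to (assumed)

dS_water = m*c*log(T_hot/T_env)         # water heats up: its entropy rises
dS_room  = -m*c*(T_hot - T_env)/T_env   # the room gives up heat at T_env
dS_total = dS_water + dS_room           # negative for this direction

print(f"Del_S_total = {dS_total:.1f} J/K")
print(f"ln(relative probability) ~ Del_S_total/k_B = {dS_total/k_B:.2e}")
# Prints roughly -3e+23, i.e. a chance of order exp(-3*10^23):
# nonzero in principle, unobservably small in practice.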

Otherwise they are the same. This does not mean, however, that
information comes into play in the Boltzmann-Gibbs formulation. You
have missed my comment on this, so I will repeat it.

On 05.02.2012 19:28 Evgenii Rudnyi said the following: ...

I have browsed the paper. I should say that I am not impressed.
The logic is exactly the same as in other papers and books.

I have nothing against the Shannon entropy (Section 3 in the
paper). Shannon can use the term entropy (why not) but then we
should just distinguish between the informational entropy and the
thermodynamic entropy as they have been introduced for completely
different problems.

The logic that both entropies are the same is in Section 4 and it
is expressed bluntly as

p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i)
is equivalent to the Shannon entropy up to the multiplicative
constant nk and the additive constant C."

p. 15 (129) "The most straightforward connection is between the
Gibbs entropy and the continuous Shannon entropy, which differ
only by the multiplicative constant k."

Personally I find this clumsy. In my view, the same mathematical
structure of equations does not say that the phenomena are
related. For example the Poisson equation for electrostatics is
mathematically equivalent to the stationary heat conduction
equation. So what? Well, one creative use is for people who have
a thermal FEM solver and do not have an electrostatic solver.
They can solve an electrostatic problem by using a thermal FEM
solver by means of mathematical analogy. This does happen but I
doubt that we could state that the stationary heat conduction is
equivalent to electrostatics.
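
For the record, the analogy is exact at the level of the equations
(standard textbook material, written here in the plain notation of
this thread):

  electrostatics:              div( eps grad phi ) = -rho
                               (potential phi, permittivity eps, charge density rho)
  stationary heat conduction:  div( k grad T ) = -q
                               (temperature T, conductivity k, heat source density q)

With the mapping phi <-> T, eps <-> k, rho <-> q the two boundary-value
problems are identical, which is why a thermal FEM solver can be fed an
electrostatic problem. The mathematics carries over; the physics does not.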


In my opinion, the similarity of mathematical equations does not mean
that the phenomena are the same. Basically it is a matter of
definitions. If you define information through the Shannon entropy,
that is okay. You would, however, have to prove that the Shannon
entropy is the same as the thermodynamic entropy. In this respect the
similarity of equations is, in my view, a weak argument.


It is not based just on the similarity of equations. The equations
are similar because they come from the same concepts. The Shannon
information measure of the amount of phase space available to a
system, given the value of some macro variables like temperature,
pressure,... is proportional to the statistical mechanical entropy of
the system. There are idealizations in the analysis, both on the
thermodynamic and on the statistical mechanics side. Among the
idealizations is the neglect of bulk shapes (e.g. the text stamped on
a coin) and collective motions (e.g. acoustic waves).
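
The standard calculation behind this proportionality claim runs as
follows (a sketch of the usual combinatorial argument, in the plain
notation of this thread). For n particles distributed over cells with
occupation numbers n_i = n p_i, the Boltzmann entropy of the
distribution is

  S_B = k ln W = k ln( n! / (n_1! n_2! ...) ).

With Stirling's approximation ln n! ~ n ln n - n this becomes

  S_B ~ -n k SUM_i p_i ln p_i = n k H(p)   (up to sub-leading additive terms),

where H(p) = -SUM_i p_i ln p_i is the Shannon entropy; this is the
"multiplicative constant nk and additive constant C" of the p. 13
quote. Likewise the Gibbs entropy S_G = -k INT rho ln rho dGamma is k
times the continuous Shannon entropy of the phase-space density, which
is the p. 15 quote.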


Brent,

I would suggest looking briefly at the history.

Statistical thermodynamics was developed by Boltzmann and Gibbs, and at
that time there was no information in it. This remained so for quite a
while, and many famous physicists found no information in statistical
mechanics.


Information entropy started with Shannon's work, where he writes

"The form of H will be recognized as that of entropy as defined in
certain formulations of statistical mechanics ..."

Re: Entropy: A Guide for the Perplexed

2012-02-06 Thread meekerdb

On 2/6/2012 11:18 AM, Evgenii Rudnyi wrote:

On 05.02.2012 22:33 Russell Standish said the following:

On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:

The funniest part is in the conclusion

p. 28(142) "First, all notions of entropy discussed in this essay,
except the thermodynamic and the topological entropy, can be
understood as variants of some information-theoretic notion of
entropy."

I understand it this way. When I am working with gas, liquid or
solid at the level of experimental thermodynamics, the information
according to the authors is not there (at this point I am in
agreement with them). Yet, as soon as theoretical physicists start
thinking about these objects, they happen to be fully filled with
information.

Evgenii


Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will
certainly be arguing against orthodoxy, but you're welcome to try.


There is some difference between the entropy in classical thermodynamics and in 
statistical thermodynamics. I will copy my old text to describe it.


To explain this, let us consider a simple experiment. We bring a glass of hot water into 
the room and leave it there. Eventually the temperature of the water becomes equal to the 
ambient temperature. In classical thermodynamics this process is considered irreversible, 
that is, the Second Law forbids the water in the glass from becoming hot again 
spontaneously. This is in complete agreement with our experience, so one would expect the 
same from statistical mechanics. There, however, the entropy has a statistical meaning and 
there is a nonzero chance that the water will become hot again. Moreover, there is a 
theorem (Poincaré recurrence) which states that if we wait long enough, the water in the 
glass must become hot again.


Otherwise they are the same. This does not mean, however, that information comes into play 
in the Boltzmann-Gibbs formulation. You have missed my comment on this, so I will repeat it.


On 05.02.2012 19:28 Evgenii Rudnyi said the following:
...
> I have browsed the paper. I should say that I am not impressed. The
> logic is exactly the same as in other papers and books.
>
> I have nothing against the Shannon entropy (Section 3 in the paper).
>  Shannon can use the term entropy (why not) but then we should just
> distinguish between the informational entropy and the thermodynamic
> entropy as they have been introduced for completely different
> problems.
>
> The logic that both entropies are the same is in Section 4 and it is
>  expressed bluntly as
>
> p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is
> equivalent to the Shannon entropy up to the multiplicative constant
> nk and the additive constant C."
>
> p. 15 (129) "The most straightforward connection is between the Gibbs
>  entropy and the continuous Shannon entropy, which differ only by the
>  multiplicative constant k."
>
> Personally I find this clumsy. In my view, the same mathematical
> structure of equations does not say that the phenomena are related.
> For example the Poisson equation for electrostatics is mathematically
>  equivalent to the stationary heat conduction equation. So what?
> Well, one creative use is for people who have a thermal FEM solver
> and do not have an electrostatic solver. They can solve an
> electrostatic problem by using a thermal FEM solver by means of
> mathematical analogy. This does happen but I doubt that we could
> state that the stationary heat conduction is equivalent to
> electrostatics.

In my opinion, the similarity of mathematical equations does not mean that the phenomena 
are the same. Basically it is a matter of definitions. If you define information through 
the Shannon entropy, that is okay. You would, however, have to prove that the Shannon 
entropy is the same as the thermodynamic entropy. In this respect the similarity of 
equations is, in my view, a weak argument.


It is not based just on the similarity of equations.  The equations are similar because 
they come from the same concepts.  The Shannon information measure of the amount of phase 
space available to a system, given the value of some macro variables like temperature, 
pressure,... is proportional to the statistical mechanical entropy of the system.  There 
are idealizations in the analysis, both on the thermodynamic and on the statistical 
mechanics side. Among the idealizations is the neglect of bulk shapes (e.g. the text 
stamped on a coin) and collective motions (e.g. acoustic waves).


Brent



Do you have anything else to support the claim that the thermodynamic entropy is 
information, except that the two equations are similar to each other?


Evgenii



Re: Entropy: A Guide for the Perplexed

2012-02-06 Thread Evgenii Rudnyi

On 06.02.2012 17:44 Jason Resch said the following:

I think entropy is better intuitively understood as uncertainty. The
entropy of a gas is the uncertainty of the particle positions and
velocities. The hotter it is, the more uncertainty there is. A
certain amount of information is required to eliminate this
uncertainty.

Jason


Could you please show how your definition of entropy could be employed 
to build, for example, the following phase diagram:


http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

If you find such a question too complicated, please consider the 
textbook-level problem below and show how you would solve it using 
uncertainties.


Evgenii

———–
Problem. Given temperature, pressure, and initial number of moles of 
NH3, N2 and H2, compute the equilibrium composition.


To solve the problem, one finds the thermodynamic properties of NH3, N2 
and H2, for example in the JANAF Tables, and then computes the 
equilibrium constant.


From thermodynamic tables (all values are molar values at the standard
pressure of 1 bar; I have omitted the superscript o for simplicity, but
it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is
not a big deal to extend the equations to include heat capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure and the initial number of moles are given, it is 
rather straightforward to compute the equilibrium composition. If you need 
help, please just let me know.

———–
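
For concreteness, the last step can be written out as a short script.
This is only a sketch of the procedure described above: Del_Cp_r = 0 as
assumed in the text, and the temperature, pressure, initial moles and
thermodynamic values below are round illustrative numbers, not actual
JANAF data.

from math import exp

R  = 8.314   # J/(mol K)
T  = 600.0   # K, assumed
P  = 1.0     # bar, total pressure, assumed
P0 = 1.0     # bar, standard pressure

# Standard molar values at 298.15 K (illustrative; take real ones from JANAF)
Df_H = {'NH3': -45.90e3, 'N2': 0.0,    'H2': 0.0}     # J/mol
S298 = {'NH3': 192.77,   'N2': 191.61, 'H2': 130.68}  # J/(mol K)

# Reaction: 2 NH3 = N2 + 3 H2
Del_H = Df_H['N2'] + 3*Df_H['H2'] - 2*Df_H['NH3']
Del_S = S298['N2'] + 3*S298['H2'] - 2*S298['NH3']
Del_G = Del_H - T*Del_S          # Del_Cp_r assumed to be zero
Kp = exp(-Del_G/(R*T))

n0 = {'NH3': 1.0, 'N2': 0.0, 'H2': 0.0}   # initial moles, assumed

def residual(xi):
    """Kp minus the reaction quotient at extent of reaction xi."""
    n_NH3 = n0['NH3'] - 2*xi
    n_N2  = n0['N2'] + xi
    n_H2  = n0['H2'] + 3*xi
    n_tot = n_NH3 + n_N2 + n_H2
    Q = (n_N2/n_tot) * (n_H2/n_tot)**3 / (n_NH3/n_tot)**2 * (P/P0)**2
    return Kp - Q

# Bisection for xi in (0, n0_NH3/2): with these initial moles the residual
# is positive near 0 and negative near n0_NH3/2
lo, hi = 1e-12, n0['NH3']/2 - 1e-12
for _ in range(200):
    mid = 0.5*(lo + hi)
    if residual(lo)*residual(mid) <= 0:
        hi = mid
    else:
        lo = mid
xi = 0.5*(lo + hi)

n_eq = {'NH3': n0['NH3'] - 2*xi, 'N2': n0['N2'] + xi, 'H2': n0['H2'] + 3*xi}
print(f"Kp = {Kp:.3g}, extent of reaction = {xi:.4f} mol")
print("equilibrium moles:", {s: round(v, 4) for s, v in n_eq.items()})

With these illustrative numbers Kp comes out in the hundreds at 600 K,
so the ammonia is largely decomposed; with real JANAF data and the
Del_Cp_r correction the numbers shift, but the procedure is the same.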




On Feb 5, 2012, at 12:28 PM, Evgenii Rudnyi wrote:


On 05.02.2012 17:16 Evgenii Rudnyi said the following:

On 24.01.2012 22:56 meekerdb said the following:


In thinking about how to answer this I came across an
excellent paper by Roman Frigg and Charlotte Werndl
http://www.romanfrigg.org/writings/EntropyGuide.pdf which
explicates the relation more comprehensively than I could and
which also gives some historical background and extensions:
specifically look at section 4.

Brent





Re: Entropy: A Guide for the Perplexed

2012-02-06 Thread Evgenii Rudnyi

On 05.02.2012 22:33 Russell Standish said the following:

On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:

The funniest part is in the conclusion

p. 28(142) "First, all notions of entropy discussed in this essay,
except the thermodynamic and the topological entropy, can be
understood as variants of some information-theoretic notion of
entropy."

I understand it this way. When I am working with gas, liquid or
solid at the level of experimental thermodynamics, the information
according to the authors is not there (at this point I am in
agreement with them). Yet, as soon as theoretical physicists start
thinking about these objects, they happen to be fully filled with
information.

Evgenii


Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will
certainly be arguing against orthodoxy, but you're welcome to try.


There is some difference between the entropy in classical 
thermodynamics and in statistical thermodynamics. I will copy my old 
text to describe it.


To explain this, let us consider a simple experiment. We bring a glass 
of hot water into the room and leave it there. Eventually the 
temperature of the water becomes equal to the ambient temperature. In 
classical thermodynamics this process is considered irreversible, that 
is, the Second Law forbids the water in the glass from becoming hot 
again spontaneously. This is in complete agreement with our experience, 
so one would expect the same from statistical mechanics. There, 
however, the entropy has a statistical meaning and there is a nonzero 
chance that the water will become hot again. Moreover, there is a 
theorem (Poincaré recurrence) which states that if we wait long enough, 
the water in the glass must become hot again.


Otherwise they are the same. This does not mean, however, that 
information comes into play in the Boltzmann-Gibbs formulation. You 
have missed my comment on this, so I will repeat it.


On 05.02.2012 19:28 Evgenii Rudnyi said the following:
...
> I have browsed the paper. I should say that I am not impressed. The
> logic is exactly the same as in other papers and books.
>
> I have nothing against the Shannon entropy (Section 3 in the paper).
>  Shannon can use the term entropy (why not) but then we should just
> distinguish between the informational entropy and the thermodynamic
> entropy as they have been introduced for completely different
> problems.
>
> The logic that both entropies are the same is in Section 4 and it is
>  expressed bluntly as
>
> p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is
> equivalent to the Shannon entropy up to the multiplicative constant
> nk and the additive constant C."
>
> p. 15 (129) "The most straightforward connection is between the Gibbs
>  entropy and the continuous Shannon entropy, which differ only by the
>  multiplicative constant k."
>
> Personally I find this clumsy. In my view, the same mathematical
> structure of equations does not say that the phenomena are related.
> For example the Poisson equation for electrostatics is mathematically
>  equivalent to the stationary heat conduction equation. So what?
> Well, one creative use is for people who have a thermal FEM solver
> and do not have an electrostatic solver. They can solve an
> electrostatic problem by using a thermal FEM solver by means of
> mathematical analogy. This does happen but I doubt that we could
> state that the stationary heat conduction is equivalent to
> electrostatics.

In my opinion, the similarity of mathematical equations does not mean 
that the phenomena are the same. Basically it is a matter of 
definitions. If you define information through the Shannon entropy, 
that is okay. You would, however, have to prove that the Shannon 
entropy is the same as the thermodynamic entropy. In this respect the 
similarity of equations is, in my view, a weak argument.


Do you have anything else to support the claim that the thermodynamic 
entropy is information, except that the two equations are similar to 
each other?


Evgenii



If you agree that it is the same, then surely you can see that
information and entropy are related - they are both the logarithm of
a probability - in the case of Boltzmann it is the logarithm of the
number of possible microstates multiplied by the probability of the
thermodynamic state.

Are you aware of the result relating the Kolmogorov "program length"
complexity measure to the logarithm of the probability of that
program appearing in the universal prior?

Both are examples of information, but measured in different
contexts.

I will comment on the entropy context of the JANAF tables in another
post. Essentially you are asserting that the context of those tables
is the only context under which thermodynamic entropy makes sense.
All other contexts for which there is an entropy-like quantity do
not count, and those measures should be called something else. A
variety of information, or complexit

Re: Entropy: A Guide for the Perplexed

2012-02-06 Thread Jason Resch
I think entropy is better intuitively understood as uncertainty. The
entropy of a gas is the uncertainty of the particle positions and
velocities. The hotter it is, the more uncertainty there is. A
certain amount of information is required to eliminate this uncertainty.
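
A toy sketch of this reading (the molecule mass and temperatures below
are assumed, and the absolute value of a differential entropy depends
on the choice of units, but its growth with temperature does not): each
velocity component of an ideal-gas particle is Gaussian with variance
k_B*T/m, so its Shannon (differential) entropy increases with T, i.e.
the hotter the gas, the more uncertainty about the velocities.

from math import pi, e, log

k_B = 1.380649e-23   # J/K
m   = 4.65e-26       # kg, roughly the mass of an N2 molecule (assumed)

def H_velocity_bits(T):
    """Differential entropy 0.5*log2(2*pi*e*sigma^2) with sigma^2 = k_B*T/m."""
    sigma2 = k_B*T/m
    return 0.5*log(2*pi*e*sigma2, 2)

for T in (100.0, 300.0, 1000.0):
    print(f"T = {T:6.0f} K   H = {H_velocity_bits(T):5.2f} bits per velocity component")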


Jason

On Feb 5, 2012, at 12:28 PM, Evgenii Rudnyi  wrote:


On 05.02.2012 17:16 Evgenii Rudnyi said the following:

On 24.01.2012 22:56 meekerdb said the following:


In thinking about how to answer this I came across an excellent
paper by Roman Frigg and Charlotte Werndl
http://www.romanfrigg.org/writings/EntropyGuide.pdf which
explicates the relation more comprehensively than I could and which
also gives some historical background and extensions: specifically
look at section 4.

Brent


I have browsed the paper. I should say that I am not impressed. The  
logic is exactly the same as in other papers and books.


I have nothing against the Shannon entropy (Section 3 in the paper).  
Shannon can use the term entropy (why not) but then we should just  
distinguish between the informational entropy and the thermodynamic  
entropy as they have been introduced for completely different  
problems.


The logic that both entropies are the same is in Section 4 and it is  
expressed bluntly as


p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is  
equivalent to the Shannon entropy up to the multiplicative constant  
nk and the additive constant C."


p. 15 (129) "The most straightforward connection is between the  
Gibbs entropy and the continuous Shannon entropy, which differ only  
by the multiplicative constant k."


Personally I find this clumsy. In my view, the same mathematical  
structure of equations does not say that the phenomena are related.  
For example the Poisson equation for electrostatics is  
mathematically equivalent to the stationary heat conduction  
equation. So what? Well, one creative use is for people who have a  
thermal FEM solver and do not have an electrostatic solver. They can  
solve an electrostatic problem by using a thermal FEM solver by  
means of mathematical analogy. This does happen but I doubt that we  
could state that the stationary heat conduction is equivalent to  
electrostatics.


The funniest part is in the conclusion

p. 28(142) "First, all notions of entropy discussed in this essay,  
except the thermodynamic and the topological entropy, can be  
understood as variants of some information-theoretic notion of  
entropy."


I understand it this way. When I am working with gas, liquid or  
solid at the level of experimental thermodynamics, the information  
according to the authors is not there (at this point I am in  
agreement with them). Yet, as soon as theoretical physicists start  
thinking about these objects, they happen to be fully filled with  
information.


Evgenii


Brent,

I have started reading the pdf. A few comments on Section 2, Entropy
in thermodynamics.

The authors seem to be sloppy.

1) p. 2 (116). "If we consider a cyclical process—a process in which
the beginning and the end state are the same — a reversible process
leaves the system and its surroundings unchanged."

This is wrong: if one runs the Carnot cycle reversibly, heat will be
converted to work (or vice versa) and there will be changes in the
surroundings. They probably mean that if one runs the Carnot cycle
reversibly twice, first in one direction and then in the opposite one,
the surroundings will be unchanged.

2) p. 2(116). "We can then assign an absolute entropy value to every
state of the system by choosing one particular state A (we can
choose any state we please!) as the reference point."

They misuse the conventional terminology. The absolute entropy is
defined by the Third Law; they just want to employ S instead of Del S.
This is pretty dangerous: if one changes the working body in the
Carnot cycle, such a notation will lead to a catastrophe.

3) p.3(117). "If we now restrict attention to adiathermal processes
(i.e. ones in which temperature is constant),"

According to Eq. 4, which they discuss, they actually mean an adiabatic
process, in which the temperature is not constant.

However, at the end of this small section they write

p. 3(117). "S_TD has no intuitive interpretation as a measure of
disorder, disorganization, or randomness (as is often claimed). In
fact such considerations have no place in TD."

I completely agree with that, so I am going to read further.

Evgenii








Re: Entropy: A Guide for the Perplexed

2012-02-05 Thread Russell Standish
On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:
> The funniest part is in the conclusion
> 
> p. 28(142) "First, all notions of entropy discussed in this essay,
> except the thermodynamic and the topological entropy, can be
> understood as variants of some information-theoretic notion of
> entropy."
> 
> I understand it this way. When I am working with gas, liquid or
> solid at the level of experimental thermodynamics, the information
> according to the authors is not there (at this point I am in
> agreement with them). Yet, as soon as theoretical physicists start
> thinking about these objects, they happen to be fully filled with
> information.
> 
> Evgenii

Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will certainly
be arguing against orthodoxy, but you're welcome to try.

If you agree that it is the same, then surely you can see that
information and entropy are related - they are both the logarithm of a
probability - in the case of Boltzmann it is the logarithm of the
number of possible microstates multiplied by the probability of the
thermodynamic state.
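
For the simplest, equiprobable case this can be checked in one line: if
W microstates are compatible with the macrostate, each has probability
p = 1/W, so

  -ln p = ln W   and   S = k ln W = k (ln 2) H,   with H = log2 W bits,

i.e. the Boltzmann entropy is the Shannon entropy of the uniform
distribution over microstates, rescaled by the constant k ln 2.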

Are you aware of the result relating the Kolmogorov "program length"
complexity measure to the logarithm of the probability of that program
appearing in the universal prior?

Both are examples of information, but measured in different contexts.
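
A crude, computable illustration of that last point, with zlib standing
in for the ideal (uncomputable) Kolmogorov measure and with an assumed
coin bias and string length: for strings drawn from a biased coin, the
compressed length roughly tracks -log2 of the string's probability,
which is what ties "program length" to "logarithm of a probability".

import random, zlib
from math import log2

p, n = 0.9, 20000   # coin bias and string length (assumed)
random.seed(0)
bits = ''.join('1' if random.random() < p else '0' for _ in range(n))

ones = bits.count('1')
neg_log2_prob   = -(ones*log2(p) + (n - ones)*log2(1 - p))   # bits
compressed_bits = 8*len(zlib.compress(bits.encode(), 9))     # bits, crude upper bound

print(f"-log2 P(string)      ~ {neg_log2_prob:8.0f} bits")
print(f"zlib compressed size ~ {compressed_bits:8.0f} bits")
print(f"raw size             = {8*len(bits):8.0f} bits")
# The compressed size is only a rough upper bound on the Kolmogorov
# complexity; here it lands within a small factor of -log2 P and far
# below the raw size.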

I will comment on the entropy context of the JANAF tables in another
post. Essentially you are asserting that the context of those tables
is the only context under which thermodynamic entropy makes sense. All
other contexts for which there is an entropy-like quantity do not
count, and those measures should be called something else. A variety
of information, or complexity perhaps.

Alternatively, we could recognise the modern understanding that these
terms are all essentially equivalent, but that they refer to a family
of measures that vary depending on the context.

It comes down to a terminological argument, sure, but your insistence
that thermodynamic entropy is a special case strikes me as a baroque
means of hiding the thermodynamic context - one that doesn't engender
understanding of the topic.

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Entropy: A Guide for the Perplexed

2012-02-05 Thread Evgenii Rudnyi

On 05.02.2012 17:16 Evgenii Rudnyi said the following:

On 24.01.2012 22:56 meekerdb said the following:


In thinking about how to answer this I came across an excellent
paper by Roman Frigg and Charlotte Werndl
http://www.romanfrigg.org/writings/EntropyGuide.pdf which
explicates the relation more comprehensively than I could and which
also gives some historical background and extensions: specifically
look at section 4.

Brent


I have browsed the paper. I should say that I am not impressed. The 
logic is exactly the same as in other papers and books.


I have nothing against the Shannon entropy (Section 3 in the paper). 
Shannon can use the term entropy (why not) but then we should just 
distinguish between the informational entropy and the thermodynamic 
entropy as they have been introduced for completely different problems.


The logic that both entropies are the same is in Section 4 and it is 
expressed bluntly as


p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is 
equivalent to the Shannon entropy up to the multiplicative constant nk 
and the additive constant C."


p. 15 (129) "The most straightforward connection is between the Gibbs 
entropy and the continuous Shannon entropy, which differ only by the 
multiplicative constant k."


Personally I find this clumsy. In my view, the same mathematical 
structure of equations does not say that the phenomena are related. For 
example the Poisson equation for electrostatics is mathematically 
equivalent to the stationary heat conduction equation. So what? Well, 
one creative use is for people who have a thermal FEM solver and do not 
have an electrostatic solver. They can solve an electrostatic problem by 
using a thermal FEM solver by means of mathematical analogy. This does 
happen but I doubt that we could state that the stationary heat 
conduction is equivalent to electrostatics.


The funniest part is in the conclusion

p. 28(142) "First, all notions of entropy discussed in this essay, 
except the thermodynamic and the topological entropy, can be understood 
as variants of some information-theoretic notion of entropy."


I understand it this way. When I am working with gas, liquid or solid at 
the level of experimental thermodynamics, the information according to 
the authors is not there (at this point I am in agreement with them). 
Yet, as soon as theoretical physicists start thinking about these 
objects, they happen to be fully filled with information.


Evgenii


Brent,

I have started reading the pdf. A few comments on Section 2, Entropy
in thermodynamics.

The authors seem to be sloppy.

1) p. 2 (116). "If we consider a cyclical process—a process in which
the beginning and the end state are the same — a reversible process
leaves the system and its surroundings unchanged."

This is wrong: if one runs the Carnot cycle reversibly, heat will be
converted to work (or vice versa) and there will be changes in the
surroundings. They probably mean that if one runs the Carnot cycle
reversibly twice, first in one direction and then in the opposite one,
the surroundings will be unchanged (see the short bookkeeping sketch
after these remarks).

2) p. 2(116). "We can then assign an absolute entropy value to every
 state of the system by choosing one particular state A (we can
choose any state we please!) as the reference point."

They misuse the conventional terminology. The absolute entropy is
defined by the Third Law; they just want to employ S instead of Del S.
This is pretty dangerous: if one changes the working body in the
Carnot cycle, such a notation will lead to a catastrophe.

3) p.3(117). "If we now restrict attention to adiathermal processes
(i.e. ones in which temperature is constant),"

According to Eq. 4, which they discuss, they actually mean an adiabatic
process, in which the temperature is not constant.

However, at the end of this small section they write

p. 3(117). "S_TD has no intuitive interpretation as a measure of
disorder, disorganization, or randomness (as is often claimed). In
fact such considerations have no place in TD."

I completely agree with that, so I am going to read further.
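
A short bookkeeping sketch of the point in remark 1), using only the
standard Carnot relations. For one reversible cycle between reservoirs
at T_h and T_c:

  working body (full cycle):  Del_S_sys = 0
  hot reservoir:              Del_S_h   = -Q_h/T_h
  cold reservoir:             Del_S_c   = +Q_c/T_c = +Q_h/T_h   (reversibility)
  total:                      Del_S     = 0

The total entropy is unchanged, but the surroundings are not: the hot
reservoir has lost Q_h, the cold one has gained Q_c, and the work
W = Q_h - Q_c has been delivered. Only running the cycle a second time
in the reverse direction, driven by that work, restores the reservoirs,
which is the correction suggested in remark 1).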

Evgenii


