On 06.02.2012 21:10 meekerdb said the following:
On 2/6/2012 11:18 AM, Evgenii Rudnyi wrote:
On 05.02.2012 22:33 Russell Standish said the following:
On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:
The funniest part is in the conclusion:

p. 28(142) "First, all notions of entropy discussed in this
essay, except the thermodynamic and the topological entropy,
can be understood as variants of some information-theoretic
notion of entropy."

I understand it this way. When I am working with a gas, liquid
or solid at the level of experimental thermodynamics, the
information, according to the authors, is not there (at this
point I am in agreement with them). Yet, as soon as theoretical
physicists start thinking about these objects, the objects
suddenly turn out to be full of information.

Evgenii

Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will
certainly be arguing against orthodoxy, but you're welcome to
try.

There is some difference between the entropy in classical and in
statistical thermodynamics. I will copy my old text to describe
it.

In order to explain this, let us consider a simple experiment.
We bring a glass of hot water into the room and leave it there.
Eventually the temperature of the water will be equal to the
ambient temperature. In classical thermodynamics this process is
considered irreversible, that is, the Second Law forbids the water
in the glass from spontaneously becoming hot again. This is in
complete agreement with our experience, so one would expect the
same from statistical mechanics. There, however, the entropy has a
statistical meaning and there is a nonzero chance that the water
will become hot again. Moreover, there is a theorem (Poincaré
recurrence) which states that if we wait long enough, the water in
the glass must become hot again.
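To put a number on that "nonzero chance", here is a minimal
back-of-the-envelope sketch in Python. The glass (0.2 kg of water
cooling from 60 °C to 20 °C in a 20 °C room) is my own illustrative
assumption, and the estimate uses the standard result that the
probability of a spontaneous fluctuation back to the initial state
scales as exp(-dS_total/k_B).

  import math

  # Illustrative assumptions (not from the original post):
  m = 0.2             # kg of water in the glass
  c = 4186.0          # J/(kg*K), specific heat of water
  T_hot = 333.15      # K (60 C), initial water temperature
  T_room = 293.15     # K (20 C), ambient temperature
  k_B = 1.380649e-23  # J/K, Boltzmann constant

  # Entropy change of the water as it cools (negative):
  dS_water = m * c * math.log(T_room / T_hot)

  # Heat released to the room and the room's entropy gain
  # (the room is treated as a reservoir at constant temperature):
  Q = m * c * (T_hot - T_room)
  dS_room = Q / T_room

  # Total entropy produced by the irreversible cooling:
  dS_total = dS_water + dS_room   # roughly +7 J/K

  # Fluctuation estimate: probability of a spontaneous return to
  # the hot state is of order exp(-dS_total / k_B).
  exponent = dS_total / k_B       # roughly 5e23
  print("entropy production: %.1f J/K" % dS_total)
  print("reversal probability ~ exp(-%.2e), effectively zero" % exponent)

So the statistical "chance" exists in principle, but it is of the
order exp(-10^23), which is why the two descriptions never disagree
in practice.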

Otherwise, they are the same. This does not mean, however, that
information comes into play in the Boltzmann-Gibbs formulation.
You have missed my comment on this, so I will repeat it.

On 05.02.2012 19:28 Evgenii Rudnyi said the following: ...
I have browsed the paper. I should say that I am not impressed.
The logic is exactly the same as in other papers and books.

I have nothing against the Shannon entropy (Section 3 in the
paper). Shannon can use the term entropy (why not?), but then we
should simply distinguish between the informational entropy and the
thermodynamic entropy, as they have been introduced for completely
different problems.

The logic that both entropies are the same is in Section 4 and it
is expressed bluntly as

p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i)
is equivalent to the Shannon entropy up to the multiplicative
constant nk and the additive constant C."

p. 15 (129) "The most straightforward connection is between the
Gibbs entropy and the continuous Shannon entropy, which differ
only by the multiplicative constant k."
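For what it is worth, the "multiplicative constant" claim itself is
easy to check numerically. Below is a minimal sketch (the four-state
distribution is my own arbitrary choice) that computes the Shannon
entropy in bits and the Gibbs-style entropy -k_B * sum(p_i ln p_i)
for the same probabilities; they differ only by the factor k_B ln 2.

  import math

  k_B = 1.380649e-23  # J/K, Boltzmann constant

  # An arbitrary discrete distribution over microstates (illustrative only):
  p = [0.5, 0.25, 0.125, 0.125]

  # Shannon entropy in bits: H = -sum p_i log2 p_i
  H_shannon = -sum(pi * math.log2(pi) for pi in p)

  # Gibbs-style entropy: S = -k_B sum p_i ln p_i
  S_gibbs = -k_B * sum(pi * math.log(pi) for pi in p)

  print(H_shannon)                      # 1.75 bits
  print(S_gibbs)                        # about 1.68e-23 J/K
  print(S_gibbs / (k_B * math.log(2)))  # 1.75 again, equal to H_shannon

Whether this purely formal identity says anything about the
phenomena is exactly the point at issue below.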

Personally I find this clumsy. In my view, the same mathematical
structure of the equations does not imply that the phenomena are
related. For example, the Poisson equation for electrostatics is
mathematically equivalent to the stationary heat conduction
equation. So what? Well, one creative use is for people who have
a thermal FEM solver but do not have an electrostatic solver.
They can solve an electrostatic problem with the thermal FEM
solver by means of the mathematical analogy. This does happen, but
I doubt that we could state that stationary heat conduction is
equivalent to electrostatics.
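As a concrete illustration of that analogy, here is a minimal
sketch with a made-up one-dimensional problem of my own choosing:
one and the same finite-difference Poisson solver returns the
electrostatic potential when fed a charge density divided by the
permittivity, and the steady-state temperature when fed a heat
source divided by the thermal conductivity, because both problems
obey the same equation.

  def solve_poisson_1d(f, n=50):
      # Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 by finite differences.
      h = 1.0 / (n + 1)
      # Tridiagonal system: -u[i-1] + 2*u[i] - u[i+1] = h*h*f(x_i)
      a = [-1.0] * n                      # sub-diagonal
      b = [2.0] * n                       # main diagonal
      c = [-1.0] * n                      # super-diagonal
      d = [f((i + 1) * h) * h * h for i in range(n)]
      # Thomas algorithm: forward elimination, then back substitution.
      for i in range(1, n):
          w = a[i] / b[i - 1]
          b[i] -= w * c[i - 1]
          d[i] -= w * d[i - 1]
      u = [0.0] * n
      u[-1] = d[-1] / b[-1]
      for i in range(n - 2, -1, -1):
          u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
      return u

  # Electrostatics: -phi'' = rho / epsilon (potential from a charge density)
  eps0 = 8.854e-12                        # F/m
  phi = solve_poisson_1d(lambda x: 1e-9 / eps0)

  # Steady heat conduction: -T'' = q / k (temperature from a heat source)
  k_copper = 400.0                        # W/(m*K)
  T = solve_poisson_1d(lambda x: 1e4 / k_copper)

  # Same solver, same equation; only the physical interpretation differs.
  print(max(phi), max(T))

The solver has no idea which physics it is doing; the analogy lives
entirely in how we name the inputs and outputs.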

In my opinion, the similarity of mathematical equations does not
mean that the phenomena are the same. Basically it is a matter of
definitions. If you define information through the Shannon entropy,
that is okay. You have, however, to prove that the Shannon
entropy is the same as the thermodynamic entropy. In this respect,
the similarity of the equations is, in my view, a weak argument.

It is not based just on the similarity of equations. The equations
are similar because they come from the same concepts. The Shannon
information measure of the amount of phase space available to a
system, given the value of some macro variables like temperature,
pressure,... is proportional to the statistical mechanical entropy of
the system. There are idealizations in the analysis, both on the
thermodynamic and on the statistical mechanics side. Among the
idealizations is the neglect of bulk shapes (e.g. the text stamped on
a coin) and collective motions (e.g. acoustic waves).

Brent,

I would suggest looking briefly at the history.

Statistical thermodynamics was developed by Boltzmann and Gibbs, and at that time there was no information in it. This lasted for quite a while, and many famous physicists did not find any information in statistical mechanics.

The information entropy started with Shannon's work, where he writes:

"The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics8 where pi is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann s famous H theorem."

Yet he only shows that the equation is similar; he does not make a
statement about the meaning of such a similarity, that is, he does not identify his entropy with the thermodynamic entropy. He just uses the term, nothing more. From then on there were two similar equations describing two different phenomena, and this state of affairs again lasted for quite a while.

Now let me quote from Edwin T. Jaynes's first paper:

p. 622 after eq (2-3) (this is the Shannon equation) "Since this is just the expression for entropy as found in statistical mechanics, it will be called the entropy of the probability distribution p_i; henceforth we will consider the terms "entropy" and "uncertainty" as synonymous."

This is exactly the logic that I have mentioned above and that is expressed in the paper you gave me: as the two equations are the same, they must describe the same phenomenon. In my view this is clumsy, and I have given an example in which the same mathematical equation describes two different physical phenomena.

If you talk about the same concept, let me ask you the following. The only example of entropy as used by engineers in informatics has been given by Jason, and I will quote him below. Could you please tell me: what is the thermodynamic entropy of what is discussed in his example?

Evgenii


On 03.02.2012 00:14 Jason Resch said the following:
...
> Evgenii,
>
> Sure, I could give a few examples as this somewhat intersects with my
> line of work.
>
> The NIST 800-90 recommendation (
> http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
> for random number generators is a document for engineers implementing
> secure pseudo-random number generators.  An example of where it is
> important is when considering entropy sources for seeding a random
> number generator.  If you use something completely random, like a
> fair coin toss, each toss provides 1 bit of entropy.  The formula is
> -log2(predictability).  With a coin flip, you have at best a .5
> chance of correctly guessing it, and -log2(.5) = 1.  If you used a
> die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
> entropy.  The ability to measure unpredictability is necessary to
> ensure, for example, that predicting the random inputs that went
> into generating a cryptographic key is at least as difficult as
> brute-forcing the key itself.
>
> In addition to security, entropy is also an important concept in the
> field of data compression.  The amount of entropy in a given bit
> string represents the theoretical minimum number of bits it takes to
> represent the information.  If 100 bits contain 100 bits of entropy,
> then there is no compression algorithm that can represent those 100
> bits with fewer than 100 bits.  However, if a 100 bit string contains
> only 50 bits of entropy, you could compress it to 50 bits.  For
> example, let's say you had 100 coin flips from an unfair coin.  This
> unfair coin comes up heads 90% of the time.  Each flip carries
> -(0.9*log2(0.9) + 0.1*log2(0.1)) = 0.469 bits of entropy.  Thus, a
> sequence of 100 coin flips with this biased coin could be
> represented with about 47 bits.  There are only about 46.9 bits of
> information / entropy contained in that 100 bit long sequence.
>
> Jason
>
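To make the arithmetic in Jason's example easy to check, here is a
minimal sketch in Python of the quantities he mentions: the
per-event entropy of a fair coin and a fair die used as seed
material, and the per-flip binary entropy of the 90/10 biased coin
that sets the compression limit. The helper functions are my own
names for the standard formulas; nothing here comes from NIST SP
800-90A itself.

  import math

  def surprisal_bits(p):
      # Information content of one outcome with probability p, in bits.
      return -math.log2(p)

  def binary_entropy_bits(p):
      # Shannon entropy per flip of a coin with P(heads) = p, in bits.
      q = 1.0 - p
      return -(p * math.log2(p) + q * math.log2(q))

  # Seeding a random number generator:
  print(surprisal_bits(0.5))      # fair coin toss: 1.0 bit per toss
  print(surprisal_bits(1.0 / 6))  # fair die roll: about 2.58 bits per roll

  # Compression limit for the biased coin (90% heads):
  h = binary_entropy_bits(0.9)    # about 0.469 bits per flip
  print(h, 100 * h)               # 100 flips carry about 46.9 bits of
                                  # entropy, so no lossless code can get
                                  # below roughly 47 bits on average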





Brent


Do you have anything else to support that the thermodynamic entropy
is information except that the two equations are similar with each
other?

Evgenii

