On 2/20/2012 12:02 PM, Evgenii Rudnyi wrote:
On 20.02.2012 19:54 meekerdb said the following:
On 2/20/2012 10:33 AM, Evgenii Rudnyi wrote:
On 19.02.2012 22:13 Russell Standish said the following:
On Sun, Feb 19, 2012 at 11:21:01AM -0500, John Clark wrote:
On Sun, Feb 19, 2012 at 5:32 AM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:


If one defines a thought experiment with Maxwell's demon well,
then it is quite clear that such a thing does not exist. Why
then spend so much time on it?


Maxwell's demon is possible in classical physics, and it was
not clear that quantum mechanics made it impossible until 1929,
when Leo Szilard proved that to be the case. And understanding
just why it cannot exist aids in understanding the
relationship between energy, information, entropy and
reversibility. Maxwell's demon was the starting point for Rolf
Landauer's discovery in 1961 that erasing information always
requires energy and increases entropy, because it is
thermodynamically irreversible.
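
To put a number on Landauer's bound (a back-of-the-envelope sketch in
Python; the temperature is just a nominal room-temperature choice, not
anything from the original argument):

  import math

  k_B = 1.380649e-23   # Boltzmann constant, J/K
  T = 300.0            # nominal room temperature, K (my assumption)

  # Landauer's bound: minimum heat that must be dissipated
  # to erase one bit of information at temperature T
  E_bit = k_B * T * math.log(2)
  print(E_bit)   # ~2.87e-21 J per erased bit

Real logic gates dissipate many orders of magnitude more than this per
operation, which is why the bound matters in principle rather than in
everyday engineering.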


Good answer, John. Does anyone want to pick on Evgenii's comments
about Chris Adami's book?

It's weird, because Chris's book gives some of the best examples
of the application of statistical physics to artificial life. In
particular, there is his observation that mutation should play a
role analogous to temperature in an evolutionary process, and
that several evolutionary regimes appear as the mutation rate is
varied, corresponding to phase transitions in materials.
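
As a toy illustration of the mutation-as-temperature point, here is a
minimal quasispecies-style sketch in Python. It is not Avida, and the
landscape, population size and rates are all invented for illustration:

  import random

  L = 20            # genome length in bits
  N = 500           # population size
  GENS = 300        # generations per mutation-rate setting
  ADVANTAGE = 2.0   # fitness of the all-ones 'master' genome; others get 1.0

  def fitness(genome):
      return ADVANTAGE if all(genome) else 1.0

  def evolve(u):
      """Run selection + mutation at per-bit mutation rate u; return the
      fraction of the final population still on the fitness peak."""
      pop = [[1] * L for _ in range(N)]   # start everyone on the peak
      for _ in range(GENS):
          weights = [fitness(g) for g in pop]
          parents = random.choices(pop, weights=weights, k=N)
          # each bit flips independently with probability u
          pop = [[b ^ (random.random() < u) for b in g] for g in parents]
      return sum(all(g) for g in pop) / N

  # Sweep the mutation rate. For this single-peak landscape the error
  # threshold sits roughly where (1-u)**L drops below 1/ADVANTAGE, u ~ 0.034.
  for u in (0.005, 0.01, 0.02, 0.03, 0.05, 0.08):
      print(f"u = {u:.3f}   fraction on peak = {evolve(u):.2f}")

At low mutation the population stays ordered on the peak; past the
threshold it melts into the random part of sequence space, which is the
sense in which mutation behaves like temperature.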

I have nothing against Adami's book as such. His description of his
software Avida and his experiments with it are okay. My point was
about his claim that his work has something to do with
thermodynamics. It definitely does not: the thermodynamic entropy is
not there, and the quotes from the book display this pretty clearly.

You have written about "an analogous role". I would not object if
you said that there is an analogy between the thermodynamic entropy
and information. Yet I am against the statement that the
thermodynamic entropy *is* information, and I believe that I have
given many examples that show this.

What you are overlooking is that information is *about* things. So
entropy in thermodynamics is information about the system's location
in phase space. That's what connects "information" and "work" and
"temperature". Entropy in communication theory is about the location
of a message in message space. It's a different application of the
same concept. The two overlap when considering the minimum free
energy requirements of a physical realization of a computation - but
existing computers operate far above those minimums so the overlap is
only of theoretical interest.
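
One way to see the "same concept" point concretely (a minimal sketch;
the distribution below is an arbitrary example of mine): the Shannon
entropy of a binary message source and the Gibbs entropy of a two-state
physical system are the same sum over probabilities, differing only by
the factor k_B ln 2 that converts bits into J/K.

  import math

  k_B = 1.380649e-23   # Boltzmann constant, J/K
  p = [0.9, 0.1]       # one probability distribution, read two ways

  # Shannon entropy: missing information about which message was sent (bits)
  H = -sum(q * math.log2(q) for q in p)

  # Gibbs entropy: missing information about which microstate the
  # system occupies (J/K)
  S = -k_B * sum(q * math.log(q) for q in p)

  print(H)                         # 0.469 bits
  print(S)                         # 4.49e-24 J/K
  print(S / (k_B * math.log(2)))   # 0.469 again: one bit of missing
                                   # information = k_B*ln(2) of entropy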

What is left is to apply your concept to practical examples; then it would be clearer what you mean. Let me repeat just one question that you have not yet answered (though I believe I have given many more examples, and they have not been worked out either).

The only example of the entropy used by engineers in informatics has been given by Jason, and I will quote him below. Could you please tell me: the thermodynamic entropy of *what* is discussed in his example?

I am ready to learn the meaning of information in thermodynamics. Please just explain it by means of practical examples.


The link I sent below works out the entropy of an ideal gas using information. You keep asking for "practical examples" but that's like asking for practical examples of calculating molecular reaction free energy from quantum mechanics. It is very difficult because it depends on the electron energy levels. It has been done in a few simple (not necessarily practical) cases as a proof of principle. But it is not the way engineering or chemistry is done because it is both easier and more reliable to measure them. But that doesn't mean that they don't have energy or that the concept of energy doesn't apply. No one calculates the strength of steel from carbon and iron atomic bonds and crystal structure either. But that doesn't mean the strength of steel is a separate, independent property.
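
As a concrete instance of that proof-of-principle point, the
information-theoretic calculation for a monatomic ideal gas fits in a
few lines. Here is a sketch in Python of the Sackur-Tetrode formula,
evaluated for argon at 298.15 K and 1 bar (my choice of example gas and
conditions); the result can be compared with the tabulated calorimetric
value, about 154.85 J/(mol K):

  import math

  k_B = 1.380649e-23      # Boltzmann constant, J/K
  h   = 6.62607015e-34    # Planck constant, J*s
  N_A = 6.02214076e23     # Avogadro constant, 1/mol
  R   = k_B * N_A         # molar gas constant, J/(mol*K)

  T = 298.15                      # temperature, K
  p = 1.0e5                       # pressure, Pa (1 bar)
  m = 39.948 * 1.66053907e-27     # mass of one argon atom, kg

  lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
  v = k_B * T / p                                 # volume per atom (ideal gas law)

  # Sackur-Tetrode equation: molar entropy of a monatomic ideal gas,
  # obtained by counting phase-space states per atom
  S = R * (math.log(v / lam**3) + 2.5)
  print(f"S(Ar) = {S:.1f} J/(mol K)")   # ~154.8, matching the measured value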

I personally do not see thermodynamics in Jason's example. Please just explain what I am missing.


On 03.02.2012 00:14 Jason Resch said the following:

…
> Evgenii,
>
> Sure, I could give a few examples as this somewhat intersects with my
> line of work.
>
> The NIST 800-90 recommendation (
> http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
> for random number generators is a document for engineers implementing
> secure pseudo-random number generators.  An example of where it is
> important is when considering entropy sources for seeding a random
> number generator.  If you use something completely random, like a
> fair coin toss, each toss provides 1 bit of entropy.  The formula is
> -log2(predictability).  With a coin flip, you have at best a .5
> chance of correctly guessing it, and -log2(.5) = 1.  If you used a
> die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
> entropy.  The ability to measure unpredictability is necessary to
> ensure, for example, that it is at least as difficult to predict the
> random inputs that went into generating a cryptographic key as it
> would be to brute-force the key itself.
>
> In addition to security, entropy is also an important concept in the
> field of data compression.  The amount of entropy in a given bit
> string represents the theoretical minimum number of bits it takes to
> represent the information.  If 100 bits contain 100 bits of entropy,
> then there is no compression algorithm that can represent those 100
> bits with fewer than 100 bits.  However, if a 100 bit string contains
> only 50 bits of entropy, you could compress it to 50 bits.  For
> example, let's say you had 100 coin flips from an unfair coin.  This
> unfair coin comes up heads 90% of the time.  Each flip then carries
> -(.9*log2(.9) + .1*log2(.1)) = 0.469 bits of entropy on average.
> Thus, a sequence of 100 flips of this biased coin could be
> represented with about 47 bits.  There are only about 47 bits of
> information / entropy contained in that 100 bit long sequence.
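
A quick numerical check of the figures in Jason's example (a throwaway
Python sketch; the helper names are mine, not from any NIST code):

  import math

  def surprisal_bits(p):
      # information gained from one outcome that had probability p
      return -math.log2(p)

  def entropy_bits(probs):
      # Shannon entropy: average surprisal, in bits per symbol
      return sum(q * surprisal_bits(q) for q in probs if q > 0)

  print(surprisal_bits(0.5))             # fair coin toss: 1.0 bit
  print(surprisal_bits(1 / 6))           # fair die roll: 2.585 bits
  print(entropy_bits([0.9, 0.1]))        # biased coin: 0.469 bits per flip
  print(100 * entropy_bits([0.9, 0.1]))  # ~46.9 bits: the shortest average
                                         # encoding of 100 biased flips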

Thermodynamic entropy is not subjective and not context dependent*,
so my claim is that Adami does not understand what the
thermodynamic entropy is. He has never taken a class in
experimental thermodynamics; that is the problem.

I'm beginning to think you have never taken a class in statistical
mechanics. There's a good online course here:

I have taken a class in statistical thermodynamics. Actually, it was a pretty good class, in which the different approaches of Boltzmann, Gibbs and others were considered in detail.

The difference is that I do not believe that a similar equation in different areas implies that the different things are the same.

If you would like to show that information is very useful in thermodynamics,

Other, smarter people have already done that.

please apply it to simple thermodynamic problems, to show how the concept of information simplifies, for example, the computation of a phase diagram (or of the equilibrium composition among N2, H2 and NH3). Should I repeat my examples?

No, you should consider why chemists don't just calculate all reactions and structure from atomic theory and QM.

Brent


Evgenii



http://farside.ph.utexas.edu/teaching/sm1/lectures/lectures.html

Those particularly relevant to this thread start at

http://farside.ph.utexas.edu/teaching/sm1/lectures/node61.html

and go through the next six or seven.

Brent


* I would accept the notion that the entropy is context dependent
in the sense that its definition depends on the thermodynamic
theory. If we change the theory, then the entropy could have some
other meaning. But that seems not to be what you meant.

Evgenii



This phenomenon I have observed in my own evolutionary
experiments. Plus, it appears to be correlated with Mark Bedau's
evolutionary classes.

This is the paper I usually refer to, although his ideas have
evolved somewhat since 1998:

M. A. Bedau, E. Snyder, N. H. Packard. 1998. A Classification of
Long-Term Evolutionary Dynamics. In C. Adami, R. Belew, H.
Kitano, and C. Taylor, eds., Artificial Life VI, pp. 228-237.
Cambridge, MA: MIT Press. Also published as Working Paper
No. 98-03-025, Santa Fe Institute, Santa Fe, NM.

http://people.reed.edu/~mab/publications/papers/alife6.pdf

