On 27.02.2012 00:13 meekerdb said the following:
On 2/26/2012 5:58 AM, Evgenii Rudnyi wrote:
I have written a summary for the discussion in the subject:

http://blog.rudnyi.ru/2012/02/entropy-and-information.html

No doubt, this is my personal viewpoint. If you see that I have
missed something, please let me know.

I think you are ignoring the conceptual unification provided by
information theory and statistical mechanics. JANAF tables only
consider the thermodynamic entropy, which is a special case in which
the macroscopic variables are temperature and pressure. You can't
look up the entropy of magnetization in the JANAF tables.

I do not get your point. The JANAF Tables were created to solve a particular problem. If you need to account for changes in concentration, surface effects, or magnetization effects, you have to extend the JANAF Tables, and such extensions have been made to solve particular problems. Experimental thermodynamics is not limited to the JANAF Tables; for example, the databases in Thermocalc already include the dependence on concentration.

Yet
magnetization of small domains is how information is stored on hard
disks; cf. David MacKay's book "Information Theory, Inference, and
Learning Algorithms", chapter 31.

Do you mean that when we consider magnetization, the entropy becomes subjective and context-dependent, and is finally filled with information?
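To make the question concrete, here is a minimal sketch (Python; the assumption that the spins are independent two-state moments is mine, real domains interact) of the quantity under discussion. The same number can be read as a thermodynamic entropy in J/K or as an information content in bits:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def spin_entropy(n_spins, p_up):
    # Gibbs/Shannon entropy of n_spins independent two-state spins,
    # each pointing "up" with probability p_up; result in J/K.
    if p_up <= 0.0 or p_up >= 1.0:
        return 0.0
    h = -(p_up * math.log(p_up) + (1.0 - p_up) * math.log(1.0 - p_up))
    return n_spins * k_B * h

def spin_entropy_bits(n_spins, p_up):
    # The same quantity expressed in bits: divide by k_B * ln(2).
    return spin_entropy(n_spins, p_up) / (k_B * math.log(2))

# One mole of fully disordered spins: R * ln(2) in J/K, N_A bits.
N_A = 6.02214076e23
print(spin_entropy(N_A, 0.5))       # ~5.76 J/K
print(spin_entropy_bits(N_A, 0.5))  # ~6.02e23 bits, one per spin

Whether calling the second reading "information" makes the entropy subjective is exactly what I am asking about.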

Did you actually read E. T. Jaynes's 1957 paper in which he introduced
the idea of basing entropy in statistical mechanics (which you also
seem to dislike) on information? He wrote "The mere fact that the
same mathematical expression -SUM[p_i log(p_i)] occurs in both
statistical mechanics and in information theory does not in itself
establish a connection between these fields. This can be done only by
finding new viewpoints from which the thermodynamic entropy and
information-theory entropy appear as the same /concept/." Then he

I had missed this quote; I will have to add it. In general, Jaynes's first paper is in a way reasonable. I wanted to understand it better, as I like maximum likelihood and have used it a lot in my own research. However, when I read the following in Jaynes's second paper (the two quotes below), I gave up.

“With such an interpretation the expression “irreversible process” represents a semantic confusion; it is not the physical process that is irreversible, but rather our ability to follow it. The second law of thermodynamics then becomes merely the statement that although our information as to the state of a system may be lost in a variety of ways, the only way in which it can be gained is by carrying out further measurements.”

“It is important to realize that the tendency of entropy to increase is not a consequence of the laws of physics as such, … . An entropy increase may occur unavoidably, due to our incomplete knowledge of the forces acting on a system, or it may be an entirely voluntary act on our part.”

This I do not understand. Do you agree with these two quotes? If so, could you please explain what he means?

goes on to show how the principle of maximum entropy can be used to
derive statistical mechanics. That it *can* be done in some other
way, and was historically as you assert, is not to the point. As an
example of how the information view of statistical mechanics extends
its application he calculates how much the spins of protons in water
would be polarized by rotating the water at 36,000 rpm. It seems you
are merely objecting to "new viewpoints" on the grounds that you can
see all that you /want/ to see from the old viewpoint.
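For readers following the thread, the derivation referred to above is easy to reproduce numerically. The sketch below (Python; the three energy levels and the bracket for beta are illustrative choices of mine, not Jaynes's) maximizes -SUM[p_i log(p_i)] under a fixed mean energy and recovers the Boltzmann form p_i = exp(-beta*E_i)/Z:

import math

def max_entropy_distribution(energies, mean_energy, tol=1e-10):
    # Maximize -sum(p_i * log(p_i)) subject to sum(p_i) = 1 and
    # sum(p_i * E_i) = mean_energy.  The Lagrange-multiplier solution
    # is p_i = exp(-beta * E_i) / Z; beta is found by bisection.
    # mean_energy must lie strictly between min and max of energies.
    def avg_energy(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)
        return sum(w * e for w, e in zip(weights, energies)) / z

    lo, hi = -50.0, 50.0  # illustrative bracket for beta
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if avg_energy(mid) > mean_energy:  # <E> decreases with beta
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return beta, [w / z for w in weights]

# Levels 0, 1, 2 with mean energy 0.7:
beta, p = max_entropy_distribution([0.0, 1.0, 2.0], 0.7)
print(beta)  # ~0.47, positive since the mean lies below the uniform value 1.0
print(p)     # decreasing probabilities p_0 > p_1 > p_2

Here beta plays the role of 1/(k_B*T); nothing beyond the two constraints goes in, and the canonical distribution comes out.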

Your quotation of Arnheim, from his book on the theory of entropy in
art, just shows his confusion. The fact that the Shannon information
is greatest when the system is, in some sense, most disordered does
not imply that the most disordered message contains the greatest
information. The Shannon information is the information we receive
when the *potential messages* are most disordered; it is a property
of an ensemble or a channel, not of a particular message.
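The distinction drawn here can be pinned down with numbers: the Shannon entropy is an average over the ensemble of potential messages, while the information carried by one particular message is its surprisal, -log2(p). A minimal sketch (Python; the two-symbol source is an invented example):

import math

def ensemble_entropy_bits(probs):
    # Shannon entropy H = -sum(p * log2(p)): the average information
    # per message -- a property of the source, not of any one message.
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def surprisal_bits(p_message):
    # Information carried by one particular message of probability p.
    return -math.log2(p_message)

# A source that almost always sends 'a': the ensemble entropy is low,
# yet the rare message 'b' individually carries many bits.
probs = {'a': 0.99, 'b': 0.01}
print(ensemble_entropy_bits(probs.values()))  # ~0.08 bits on average
print(surprisal_bits(probs['a']))             # ~0.01 bits
print(surprisal_bits(probs['b']))             # ~6.64 bits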

It is not a confusion on Arnheim's part; his book is quite good. To show why, let me quote the second sentence of your message.

> I think you are ignoring the conceptual unification provided by
> information theory and statistical mechanics.

You see, I would love to understand the conceptual unification. To this end, I have created many simple problems to understand it better. Unfortunately, you do not want to discuss them; you just speak in general terms and do not want to apply them to my simple practical problems. Hence it is hard for me to understand you.

If we are to speak about confusion, here is just one example. You say that the higher the temperature, the more information the system has. Yet engineers seem unwilling to employ this knowledge in practice. Why is that? Why do engineers seem unimpressed by the conceptual unification?
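For what it is worth, the arithmetic behind that claim is simple to state: heating raises the entropy by the integral of Cp/T, and dividing by k_B*ln(2) restates the result in bits. A minimal sketch (Python; the constant heat capacity and the value of about 75.3 J/(mol K) for liquid water are rough assumptions of mine):

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def heating_entropy(cp_molar, t1, t2):
    # Entropy gain on heating at constant pressure with a
    # temperature-independent molar heat capacity cp_molar,
    # in J/(mol K): delta_S = cp * ln(T2 / T1).
    return cp_molar * math.log(t2 / t1)

def entropy_to_bits_per_mole(delta_s_molar):
    # Restate a molar entropy change in bits, at k_B * ln(2) per bit.
    return delta_s_molar / (k_B * math.log(2))

# Warming one mole of liquid water from 298 K to 308 K:
dS = heating_entropy(75.3, 298.0, 308.0)
print(dS)                            # ~2.5 J/(mol K)
print(entropy_to_bits_per_mole(dS))  # ~2.6e23 bits per mole

The result, some 10^23 bits for a 10 K warming of a mole of water, is precisely the kind of number I have never seen an engineer use.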

Evgenii

Brent

