On 20.01.2012 05:59 Russell Standish said the following:
On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:


...


Basically, I do not understand what the term "information" then
brings. One can certainly state that information is the same as
entropy (we are free with definitions, after all), yet I miss the
meaning of that. Let me put it this way: we have the thermodynamic
entropy and the informational entropy as defined by Shannon. The
first is used to design a motor, the second to design a controller.
Now suppose that these two entropies are the same. What does this
change in the design of the motor and the controller? In my view,
nothing.
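[For reference: the formal parallel at issue is that Shannon's entropy
H = -sum_i p_i log2 p_i and the Gibbs entropy S = -k sum_i p_i ln p_i
have the same mathematical form and differ only by the constant factor
k ln 2. The question raised above is whether this formal identity has
any engineering content.]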


I can well recommend Denbigh & Denbigh's book from the 80s - it's a
bit more of a modern understanding of the topic than Jaynes :)

@book{Denbigh-Denbigh87,
  author    = {Denbigh, K. G. and Denbigh, J.},
  publisher = {Cambridge UP},
  title     = {Entropy in Relation to Incomplete Knowledge},
  year      = {1987},
}

Thanks. On biotaconv, John Avery's "Information Theory and Evolution" has been recommended, but I think I have already satisfied my curiosity with Jaynes's two papers. My personal feeling is as follows:

1) The concept of information is useless in conventional thermodynamic problems. Let us take, for example, the Fe-C phase diagram:

http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

What does information have to do with the entropies of the phases in this diagram? Do you mean that I will find an answer in Denbigh's book?

2) If physicists say that information is entropy, they must take this literally and then apply experimental thermodynamics to measure information. This, however, does not seem to happen (see the back-of-envelope sketch after this list).

3) I am working with engineers developing mechatronics products. Thermodynamics (and hence entropy) is present there, as is information. However, I have not yet met a practitioner who makes a connection between entropy and information.
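To illustrate point 2: if the identification is taken literally, any
calorimetric entropy converts directly to bits. A back-of-envelope
sketch in Python (it assumes the tabulated standard molar entropy of
liquid water, roughly 70 J/(mol*K) at 298 K; the value is approximate):

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    N_A = 6.02214076e23  # Avogadro constant, 1/mol
    S_molar = 70.0       # J/(mol*K), standard molar entropy of liquid water, approx.

    # If entropy "is" information, a measured entropy converts to bits:
    bits_per_molecule = S_molar / (N_A * k_B * math.log(2))
    print(f"{bits_per_molecule:.1f} bits per molecule")  # ~12 bits

The arithmetic is trivial; the question is whether calling this "about
12 bits per molecule" adds anything to what the calorimetric
measurement already tells us.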


By the way, have you seen the answer to my question:

Also remember that at constant volume dS = (Cv/T) dT and dU = Cv dT.
If the entropy is information, then its derivative must be related
to information as well. Hence Cv must be related to information.
This, however, means that the energy is also somehow related to
information.

If the entropy is the same as information, then through the
derivatives all thermodynamic properties are related to information
as well. I am not sure this makes sense with respect to, for
example, designing a self-driving car.
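[Spelled out: integrating at constant volume gives
S(T2) - S(T1) = ∫ (Cv/T) dT between T1 and T2. If one sets
I = S/(k ln 2), i.e. measures the entropy in bits, then
dI/dT = Cv/(k T ln 2), so Cv, and through dU = Cv dT the internal
energy as well, inherit an informational reading.]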


The information embodied in the thermodynamic state is presumably
not relevant to the design of a self-driving car. By the same token,
thermodynamic treatment (typically) discards a lot of information
useful for engineering.

Sorry, I do not understand what this means.

I am aware of works that estimate the thermodynamic limit (of order
kT) on processing information. I do not see, however, how this proves
the equivalence of information and entropy.
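The limit in question is presumably Landauer's bound: erasing one bit
dissipates at least kT ln 2 of energy. A minimal sketch of the number,
assuming room temperature:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # assumed room temperature, K

    # Landauer's bound: minimum dissipation per erased bit
    E_min = k_B * T * math.log(2)
    print(f"{E_min:.2e} J per erased bit")  # ~2.87e-21 J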

Evgenii

...

and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language."


I would say it is many people, rather than just one. I wrote "On
Complexity and Emergence" in response to the amount of unmitigated
tripe I've seen written about these topics.



I have found your work on arxiv.org and I will look at it. Thank you for mentioning it.

Evgenii

