On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
> Russell,
> 
> I know that many physicists identify entropy with information.
> Recently I had a nice discussion on biotaconv, and people pointed out
> that presumably Edwin T. Jaynes was the first to make such a
> connection (Information Theory and Statistical Mechanics, 1957).
> Google Scholar shows that his paper has been cited more than 5000
> times, which is impressive and shows that this view is indeed, in a
> way, mainstream.

Because I tend to think in terms of "negentropy", which is really
another term for information, I give priority to Schroedinger, who
wrote about the topic in the early 40s. But Jaynes was certainly
instrumental in establishing the information-based foundations of
statistical physics, even before information was properly defined (it
wasn't until the likes of Kolmogorov, Chaitin and Solomonoff in the
60s that information was really understood).

But Landauer, in the early 60s, was probably the first to make
physicists really wake up to the concept of physical information.

But then, I'm not a science historian, so what would I know :).

> 
> I have studied Jaynes' papers but I have been stuck with, for example,
> 

... snip ...

> 
> Basically I do not understand what the term information then brings.
> One can certainly state that information is the same as the entropy
> (we are free with definitions, after all). Yet I miss the meaning of
> that. Let me put it this way: we have the thermodynamic entropy and
> then the informational entropy as defined by Shannon. The first is
> used to design a motor and the second to design a controller. Now
> let us suppose that these two entropies are the same. What does this
> change in the design of a motor and a controller? In my view, nothing.
> 

I can well recommend Denbigh & Denbigh's book from the 80s - it's a
somewhat more modern treatment of the topic than Jaynes :)

@book{Denbigh-Denbigh87,
   author    = {Denbigh, K. G. and Denbigh, J.},
   title     = {Entropy in Relation to Incomplete Knowledge},
   publisher = {Cambridge UP},
   year      = {1987},
}


> By the way, have you seen the answer to my question:
> 
> >> Also remember that at constant volume dS = (Cv/T) dT and dU =
> >> Cv dT. If the entropy is information, then its derivative must be
> >> related to information as well. Hence Cv must be related to
> >> information. This, however, means that the energy is also somehow
> >> related to information.
> 
> If the entropy is the same as information, then through the
> derivatives all thermodynamic properties are related to information
> as well. I am not sure whether this makes sense with respect, for
> example, to designing a self-driving car.
> 

The information embodied in the thermodynamic state is presumably not
relevant to the design of a self-driving car. By the same token, a
thermodynamic treatment (typically) discards a lot of information
useful for engineering.
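
To make that concrete, here is a quick back-of-envelope sketch in
Python. The numbers - 1 mol of a monatomic ideal gas heated at
constant volume from 300 K to 310 K - are purely my illustrative
assumptions, not anything from your example:

# Back-of-envelope: a thermodynamic entropy change expressed in Shannon bits.
# Illustrative assumptions: 1 mol of a monatomic ideal gas heated at constant
# volume from 300 K to 310 K, so integrating dS = (Cv/T) dT gives
# dS = Cv * ln(T2/T1) with Cv = (3/2) R.
import math

R   = 8.314462618    # gas constant, J/(mol K)
k_B = 1.380649e-23   # Boltzmann constant, J/K

Cv = 1.5 * R                          # molar heat capacity at constant volume
dS = Cv * math.log(310.0 / 300.0)     # entropy change, J/K

# One bit of Shannon information corresponds to k_B * ln 2 of thermodynamic entropy.
dS_bits = dS / (k_B * math.log(2))

print(f"dS = {dS:.3f} J/K  ~  {dS_bits:.2e} bits")
# ~4e22 bits: astronomically far beyond any bit budget a controller designer handles.

So yes, through the derivatives Cv is "related to information", but the
quantities involved live on a scale that is simply irrelevant to the
engineering of the controller.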

> I am aware of works that estimated the thermodynamic limit (kT) for
> processing information. I do not see, however, how this proves the
> equivalence of information and entropy.
> 
> Evgenii
> 
> P.S. For a long time, people have identified entropy with chaos. I
> have recently read a nice book on this, Entropy and Art by Arnheim
> (1971). One quote:
> 

I guess this is the original meaning of chaos, not the more modern
meaning referring to "low-dimensional dynamical systems having strange
attractors".

> "The absurd consequences of neglecting structure but using the
> concept of order just the same are evident if one examines the
> present terminology of information theory. Here order is described
> as the carrier of information, because information is defined as the
> opposite of entropy, and entropy is a measure of disorder. To
> transmit information means to induce order. This sounds reasonable
> enough. Next, since entropy grows with the probability of a state of
> affairs, information does the opposite: it increases with its
> improbability. The less likely an event is to happen, the more
> information does its occurrence represent. This again seems
> reasonable. Now what sort of sequence of events will be least
> predictable and therefore carry a maximum of information? Obviously
> a totally disordered one, since when we are confronted with chaos we
> can never predict what will happen next. 

This rather depends on whether the disorder is informationally
significant, which is context-dependent. I discuss this point (it
relates to the Kolmogorov idea that random sequences have maximum
complexity) in my paper "On Complexity and Emergence", and I also
touch on the theme in my book "Theory of Nothing", which I know
you've read!
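
A crude way to see the Kolmogorov point is to use an off-the-shelf
compressor as a computable stand-in for Kolmogorov complexity (which
is itself uncomputable) - a sketch of my own, not something taken from
the paper:

# A compressor as a rough proxy for Kolmogorov complexity: a typical random
# string is essentially incompressible, while an ordered one collapses.
import os
import zlib

ordered = b"ab" * 50_000          # 100 kB with obvious structure
random_ = os.urandom(100_000)     # 100 kB of (pseudo)random bytes

print("ordered:", len(zlib.compress(ordered)))   # a few hundred bytes
print("random :", len(zlib.compress(random_)))   # ~100 kB, barely compressible

Whether that incompressible string carries a "maximum of information"
in any useful sense is exactly the context-dependence at issue.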

> The conclusion is that
> total disorder provides a maximum of information; 

Total disorder corresponds to a maximum of entropy. Maximum entropy
minimises the amount of information.
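
In negentropy terms this is easy to illustrate. Here is a small sketch
(the 26-symbol alphabet and the particular distributions are just my
assumptions for the example):

# "Information as negentropy": the uniform (maximally disordered) distribution
# over a 26-symbol alphabet has maximum Shannon entropy and hence zero
# negentropy; a sharply peaked distribution carries plenty of it.
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

N = 26
uniform = [1.0 / N] * N
peaked  = [0.95] + [0.05 / (N - 1)] * (N - 1)

H_max = math.log2(N)
for name, p in (("uniform", uniform), ("peaked", peaked)):
    H = shannon_entropy(p)
    print(f"{name:8s} H = {H:.3f} bits, negentropy = {H_max - H:.3f} bits")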

> and since
> information is measured by order, a maximum of order is conveyed by
> a maximum of disorder. Obviously, this is a Babylonian muddle.
> Somebody or something has confounded our language."
> 

I would say it is many people, rather than just one. I wrote "On
Complexity and Emergence" in response to the amount of unmitigated
tripe I've seen written about these topics.


-- 

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      hpco...@hpcoders.com.au
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------
