Russell,

I know that many physicists identify entropy with information. Recently I had a nice discussion on biotaconv, and people pointed out that Edwin T. Jaynes was presumably the first to make such a connection (Information theory and statistical mechanics, 1957). Google Scholar shows that his paper has been cited more than 5000 times; that is impressive, and it shows that this view is indeed, in a way, mainstream.

I have studied Jaynes's papers, but I have got stuck on passages such as these:

“With such an interpretation the expression “irreversible process” represents a semantic confusion; it is not the physical process that is irreversible, but rather our ability to follow it. The second law of thermodynamics then becomes merely the statement that although our information as to the state of a system may be lost in a variety of ways, the only way in which it can be gained is by carrying out further measurements.”

“It is important to realize that the tendency of entropy to increase is not a consequence of the laws of physics as such, … . An entropy increase may occur unavoidably, due to our incomplete knowledge of the forces acting on a system, or it may be an entirely voluntary act on our part.”

This is beyond my understanding. As I have mentioned, I do not buy it; I still take entropy as it has been defined by, for example, Gibbs.

Basically, I do not understand what the term information then brings. One can certainly state that information is the same as entropy (we are free with definitions, after all), yet I miss the meaning of such a statement. Let me put it this way: we have the thermodynamic entropy and the informational entropy as defined by Shannon. The first is used to design a motor, the second to design a controller. Now let us suppose that these two entropies are the same. What does this change in the design of a motor or a controller? In my view, nothing.
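To make explicit what such an identification amounts to formally, here is a minimal sketch (Python, with an arbitrary made-up two-state distribution): the Gibbs expression -k*sum(p ln p) and the Shannon expression -sum(p log2 p) differ only by the constant factor k ln 2.

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    # S = -k_B * sum_i p_i ln p_i, in J/K
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

def shannon_entropy(probs):
    # H = -sum_i p_i log2 p_i, in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.25, 0.75]                   # arbitrary two-state distribution
S = gibbs_entropy(p)               # "thermodynamic" units, J/K
H = shannon_entropy(p)             # bits
print(S / (k_B * math.log(2)), H)  # the two numbers coincide

So, as far as I can see, the identification is a change of units, and my question remains what it changes when one actually designs the motor or the controller.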

By the way, have you seen the answer to my question:

>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>> Cv dT. If the entropy is information, then its derivative must be
>> related to information as well. Hence Cv must be related to
>> information. This, however, means that the energy is also somehow
>> related to information.

If the entropy is the same as information, then through the derivatives all thermodynamic properties are related to information as well. I am not sure whether this makes sense with respect to, for example, designing a self-driving car.
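To spell out this derivative argument with numbers, here is a small sketch (Python; the constant heat capacity is just an assumed ideal-gas value for illustration, not measured data):

import math

R = 8.314462618          # gas constant, J/(K*mol)
N_A = 6.02214076e23      # Avogadro constant, 1/mol
k_B = R / N_A            # Boltzmann constant, J/K

Cv = 1.5 * R             # assumed constant Cv (monatomic ideal gas), J/(K*mol)
T1, T2 = 298.15, 398.15  # heating at constant volume, K

delta_S = Cv * math.log(T2 / T1)  # from dS = (Cv/T) dT, J/(K*mol)
delta_U = Cv * (T2 - T1)          # from dU = Cv dT, J/mol

# If entropy is information, the same change expressed in bits per mole:
bits_per_mole = delta_S / (k_B * math.log(2))
print(delta_S, delta_U, bits_per_mole)

If entropy is information, this routine heating apparently "writes" roughly 3.8e23 bits per mole into the gas, and through Cv the energy is tied to those same bits, which is exactly the point of my question.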

I am aware of works that estimate the thermodynamic limit (of the order of kT) on processing information. I do not see, however, how this proves the equivalence of information and entropy.
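The limit usually cited is Landauer's bound of kT ln 2 per erased bit; for reference, a one-line evaluation (Python, at an assumed room temperature of 300 K):

import math

k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 300.0                      # assumed room temperature, K

E_min = k_B * T * math.log(2)  # minimum heat dissipated to erase one bit, J
print(E_min)                   # about 2.9e-21 J per bit

This gives a lower bound on the cost of erasing a bit, but a bound on a cost is, in my view, not yet a proof that entropy and information are one and the same.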

Evgenii

P.S. For a long time people have identified entropy with chaos. I have recently read a nice book on this theme, Entropy and Art by Arnheim (1971). One quote:

"The absurd consequences of neglecting structure but using the concept of order just the same are evident if one examines the present terminology of information theory. Here order is described as the carrier of information, because information is defined as the opposite of entropy, and entropy is a measure of disorder. To transmit information means to induce order. This sounds reasonable enough. Next, since entropy grows with the probability of a state of affairs, information does the opposite: it increases with its improbability. The less likely an event is to happen, the more information does its occurrence represent. This again seems reasonable. Now what sort of sequence of events will be least predictable and therefore carry a maximum of information? Obviously a totally disordered one, since when we are confronted with chaos we can never predict what will happen next. The conclusion is that total disorder provides a maximum of information; and since information is measured by order, a maximum of order is conveyed by a maximum of disorder. Obviously, this is a Babylonian muddle. Somebody or something has confounded our language."

--
http://blog.rudnyi.ru




On 18.01.2012 23:42 Russell Standish said the following:
On Wed, Jan 18, 2012 at 08:13:07PM +0100, Evgenii Rudnyi wrote:
On 18.01.2012 18:47 John Clark said the following:
On Sun, Jan 15, 2012 at 3:54 PM, Evgenii
Rudnyi<use...@rudnyi.ru> wrote:

" Some physicists say that information is related to the
entropy"


That is incorrect: ALL physicists say that information is related
to entropy. There are quite a number of definitions of entropy;
one I like, although not as rigorous as some, does convey the
basic idea: entropy is a measure of the number of ways the
microscopic structure of something can be changed without
changing the macroscopic properties. Thus, the living human body
has very low entropy because there are relatively few changes
that could be made in it without a drastic change in macroscopic
properties, like being dead; a bucket of water has a much higher
entropy because there are lots of ways you could change the
microscopic position of all those water molecules and it would
still look like a bucket of water; cool the water and form ice
and you have less entropy because the molecules line up into an
orderly lattice, so there are fewer changes you could make. The
ultimate in high-entropy objects is a Black Hole because, whatever
is inside one, on the outside any Black Hole can be completely
described with just 3 numbers: its mass, spin and electrical
charge.

John K Clark


If you look around, you may still find a species of scientist who
works with classical thermodynamics (search, for example, for
CALPHAD). Whether you refer to them as physicists or not is your
choice. Anyway, in experimental thermodynamics people determine
entropies, for example from the CODATA tables

http://www.codata.org/resources/databases/key1.html

S°(298.15 K) / J K^-1 mol^-1

Ag (cr)  42.55 ± 0.20
Al (cr)  28.30 ± 0.10

Do you mean that 1 mole of Ag has more information than 1 mole of
Al at 298.15 K?
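
If one reads these entropies as information, dividing by R ln 2
converts them into bits per atom; a quick check (Python, using the
CODATA values above):

import math

R = 8.314462618        # gas constant, J/(K*mol)
bit = R * math.log(2)  # molar entropy corresponding to one bit per atom

for name, S in [("Ag", 42.55), ("Al", 28.30)]:       # S(298.15 K), J/(K*mol)
    print(name, round(S / bit, 2), "bits per atom")  # Ag ~ 7.38, Al ~ 4.91

On the informational reading, a mole of Ag would thus indeed carry
more bits than a mole of Al.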

Also remember that at constant volume dS = (Cv/T) dT and dU =
Cv dT. If the entropy is information, then its derivative must be
related to information as well. Hence Cv must be related to
information. This, however, means that the energy is also somehow
related to information.

Finally, the entropy is defined by the Second Law, and the best
would be to stick to this definition. Only in this case is it
possible to understand what we are talking about.

Evgenii -- http://blog.rudnyi.ru


Evgenii, while you may be right that some physicists (mostly
experimentalists) work in thermodynamics without recourse to the
notion of information, and chemists even more so, it is also true
that the modern theoretical understanding of entropy (and indeed
thermodynamics) is information-based.

This trend really became mainstream with Landauer's work
demonstrating thermodynamic limits of information processing in the
1960s, which turned earlier speculations by the likes of Schroedinger
and Brillouin into something that couldn't be ignored, even by
experimentalists.

This trend of an information basis to physics has only accelerated in
my professional lifetime - I've seen people like Hawking discuss
information processing of black holes, and we've seen concepts like
the Bekenstein bound linking geometry of space to information
capacity.

David Deutsch is surely backing a winning horse to point out that
algorithmic information theory must be a foundational strand of the
"fabric of reality".

Cheers

