On Wed, Jan 18, 2012 at 08:13:07PM +0100, Evgenii Rudnyi wrote:
> On 18.01.2012 18:47 John Clark said the following:
> >On Sun, Jan 15, 2012 at 3:54 PM, Evgenii Rudnyi<use...@rudnyi.ru>
> >wrote:
> >
> >" Some physicists say that information is related to the entropy"
> >>
> >
> >That is incorrect, ALL physicists say that information is related to
> >entropy. There are quite a number of definitions of entropy, one I
> >like, although not as rigorous as some it does convey the basic idea:
> >entropy is a measure of the number of ways the microscopic structure
> >of something can be changed without changing the macroscopic
> >properties. Thus, the living human body has very low entropy because
> >there are relatively few changes that could be made in it without a
> >drastic change in macroscopic properties, like being dead; a bucket
> >of water has a much higher entropy because there are lots of ways you
> >could change the microscopic position of all those water molecules
> >and it would still look like a bucket of water; cool the water and
> >form ice and you have less entropy because the molecules line up into
> >an orderly lattice, so there are fewer changes you could make. The
> >ultimate high-entropy object is a Black Hole because, whatever is
> >inside one, from the outside any Black Hole can be completely
> >described with just 3 numbers: its mass, spin and electrical charge.
> >
> >John K Clark
> >
> If you look around you may still find scientists who work with
> classical thermodynamics (search, for example, for CALPHAD). Whether
> you call them physicists or not is your choice. In any case, in
> experimental thermodynamics people determine entropies, for example
> from the CODATA tables
> http://www.codata.org/resources/databases/key1.html
> S°(298.15 K) / J K^-1 mol^-1
> Ag  cr  42.55 ± 0.20
> Al  cr  28.30 ± 0.10
> Do you mean that 1 mole of Ag has more information than 1 mole of Al
> at 298.15 K?
> Also remember that at constant volume dS = (Cv/T) dT and dU = Cv dT.
> If the entropy is information, then its derivative must be related to
> information as well. Hence Cv must be related to information. This,
> however, means that the energy is also somehow related to information.
> Finally, the entropy is defined by the Second Law, and it would be
> best to stick to this definition. Only then is it possible to
> understand what we are talking about.
> Evgenii
> -- 
> http://blog.rudnyi.ru
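
Clark's microstate picture quoted above can be made quantitative with Boltzmann's formula S = k ln W, where W is the number of microstates compatible with the macrostate. A minimal sketch with toy numbers (the microstate counts are illustrative, not a physical model):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_microstates: float) -> float:
    """S = k ln W: entropy of a macrostate compatible with W microstates."""
    return k_B * math.log(n_microstates)

# Clark's point: a system with more microscopic rearrangements that
# leave the macrostate unchanged has higher entropy.
w_ordered = 1e6      # few compatible microstates ("ice-like")
w_disordered = 1e30  # many compatible microstates ("liquid-like")

assert boltzmann_entropy(w_disordered) > boltzmann_entropy(w_ordered)
```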

Evgenii, while you may be right that some physicists (mostly
experimentalists) work in thermodynamics without recourse to the
notion of information, and chemists even more so, it is also true that
the modern theoretical understanding of entropy (and indeed
thermodynamics) is information-based.
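
On this information-based reading, the CODATA values quoted above do translate directly into information units: dividing a standard molar entropy by R ln 2 gives bits per atom, so in that sense 1 mole of Ag is indeed associated with more missing information than 1 mole of Al at 298.15 K. A sketch of the arithmetic:

```python
import math

R = 8.314462618  # gas constant, J/(K mol), equal to N_A * k_B

def bits_per_atom(molar_entropy: float) -> float:
    """Convert a standard molar entropy S (in J K^-1 mol^-1) to bits
    per atom: S / (N_A k_B ln 2) = S / (R ln 2)."""
    return molar_entropy / (R * math.log(2))

# CODATA standard entropies at 298.15 K from the thread above
print(bits_per_atom(42.55))  # Ag: ~7.4 bits per atom
print(bits_per_atom(28.30))  # Al: ~4.9 bits per atom
```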

This trend really became mainstream with Landauer's work in the 1960s
demonstrating the thermodynamic limits of information processing, which
turned earlier speculations by the likes of Schroedinger and Brillouin
into something that couldn't be ignored, even by experimentalists.
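
Landauer's bound states that erasing one bit of information must dissipate at least kT ln 2 of heat. A quick sketch of the number at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature: float) -> float:
    """Minimum heat (J) dissipated by erasing one bit at temperature T (K)."""
    return k_B * temperature * math.log(2)

# The bound is tiny but nonzero, which is what made information
# processing a thermodynamic question rather than a purely abstract one.
print(landauer_limit(300.0))  # ~2.87e-21 J per erased bit
```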

This trend of an information basis to physics has only accelerated
in my professional lifetime - I've seen people like Hawking discuss
the information processing of black holes, and we've seen concepts like
the Bekenstein bound linking the geometry of space to information
capacity.
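
The Bekenstein-Hawking formula makes this link concrete: a black hole's entropy scales with the area of its horizon, so its information capacity is fixed by geometry. A rough order-of-magnitude sketch for a solar-mass Schwarzschild black hole (SI constants; the mass value is illustrative):

```python
import math

G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s
hbar = 1.054571817e-34  # reduced Planck constant, J s

def black_hole_bits(mass: float) -> float:
    """Bekenstein-Hawking entropy of a Schwarzschild black hole of the
    given mass (kg), in bits: S/k = 4 pi G M^2 / (hbar c), divided by
    ln 2 to convert nats to bits."""
    return 4 * math.pi * G * mass**2 / (hbar * c * math.log(2))

M_sun = 1.989e30  # solar mass, kg
print(black_hole_bits(M_sun))  # ~1.5e77 bits
```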

David Deutsch is surely backing a winning horse in pointing out that
algorithmic information theory must be a foundational strand of the
"fabric of reality".



Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      hpco...@hpcoders.com.au
University of New South Wales          http://www.hpcoders.com.au
