On Thu, Jun 14, 2012 Craig Weinberg <whatsons...@gmail.com> wrote:
> There seems to be a lot of confusion resulting from Shannon's
> <http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf> use
> of the term entropy, and from conflating it with thermodynamic entropy. I
> maintain that Shannon Entropy™ is closer to the inverse of thermodynamic
> entropy than it is a synonym for it.
Physical entropy is a measure of the number of micro-states something can
be in without changing its macro-state. A bucket of water can be in many,
many micro-states, and yet the end result of them all would still look and
act like a plain old bucket of water. So if you wanted to know the
micro-state of that particular bucket over there, that is, the position and
momentum of every water molecule in it, it would take a great deal of
information to distinguish that particular micro-state from the huge number
of states the bucket could be in while still looking the same, far more
information than is in your body's DNA. The bucket has a lot of entropy and
a lot of information, although it is information that most humans would
consider spectacularly unimportant.
If the bucket of water froze, the molecules would line up in a regular
lattice, so the ice bucket would contain less entropy and less information
than the water bucket, because fewer micro-states could produce the same
macro-state; with ice you would be less surprised about where any given
molecule is, and mathematical entropy is a measure of surprise.
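The "entropy as surprise" point can be sketched numerically. This is my own illustration, not from the post: Shannon entropy H = -sum(p * log2(p)) over a probability distribution, with a spread-out "water-like" distribution versus a peaked "ice-like" one standing in for many vs. few plausible micro-states.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical toy distributions (assumption for illustration only):
# four equally likely micro-states vs. one overwhelmingly likely state.
water_like = [0.25, 0.25, 0.25, 0.25]   # maximal surprise: any state is as likely as any other
ice_like = [0.97, 0.01, 0.01, 0.01]     # little surprise: you can almost always guess the state

print(shannon_entropy(water_like))  # 2.0 bits
print(shannon_entropy(ice_like))    # well under 1 bit
```

The uniform distribution maximizes entropy: when every outcome is equally likely, each observation carries the most information.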
That's why if you use a lossless compression program the output tends to
look like white noise. White noise has maximum entropy and maximum
information density; you could change it in an enormous number of ways and
it would still look like white noise.
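The compression claim is easy to check empirically. A small sketch of my own (the input text and helper below are illustrative assumptions, not from the post): compress a highly redundant byte string with zlib and compare the empirical bits-per-byte entropy of input and output. The compressed stream's byte frequencies come out far closer to uniform, i.e. closer to white noise.

```python
import math
import zlib
from collections import Counter

def byte_entropy(data):
    """Empirical Shannon entropy, in bits per byte, of a byte string."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical redundant input: a small alphabet repeated with minor variation.
redundant = b"".join(b"line %d: a plain old bucket of water\n" % i
                     for i in range(2000))
compressed = zlib.compress(redundant, 9)

print(byte_entropy(redundant))   # low: small alphabet, very repetitive
print(byte_entropy(compressed))  # much higher, approaching 8 bits/byte
```

A good compressor squeezes out the predictable structure, so what remains is, by construction, hard to predict byte by byte, which is exactly what high entropy means.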
John K Clark
You received this message because you are subscribed to the Google Groups
"Everything List" group.