On 1/27/2012 2:51 PM, Craig Weinberg wrote:
On Jan 27, 11:42 am, John Clark<johnkcl...@gmail.com>  wrote:
On Thu, Jan 26, 2012  Craig Weinberg<whatsons...@gmail.com>  wrote:

If a bucket of water has more of it than DNA, then the word information
is meaningless.
You would need to send more, far far more, dots and dashes down a wire to
inform an intelligent entity what the position and velocity of every
molecule in a bucket of water is than to inform it exactly what the human
genome is.
It depends what kind of compression you are using. You could much more
easily write a probabilistic equation to simulate any given volume of
water than the same volume of DNA, especially when you get into
secondary and tertiary structure.

Now what word didn't you understand?
Condescension doesn't impress me. I understand your words perfectly,
it's just that what they are saying seems to be incorrect.

A symphony then would have less information and more entropy than random
noise.
No, a symphony would have less information but LESS entropy than random
white noise.
Ok, I think I see what the confusion is. We are operating with not
only different definitions of entropy but also different assumptions
about the universe which directly relate to information.

This Q&A: 
http://stackoverflow.com/questions/651135/shannons-entropy-formula-help-my-confusion
was the only page I could find that was written simply enough to make
sense to me. Your definition assumes that the universe is a platform
for encoding and decoding information and mine does not. You are
talking about entropy in terms of resistance to compression, i.e. lack
of redundancy. Ok, but the relationship between Shannon entropy and
thermodynamic entropy is not what you are implying it is. The
Wikipedia article was helpful: http://en.wikipedia.org/wiki/Entropy_(information_theory)
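
As an aside, the formula that Q&A is about is just
H = -sum_i p_i * log2(p_i), taken over the symbol probabilities. A
minimal Python sketch, purely illustrative:

import math
from collections import Counter

def shannon_entropy(s):
    # empirical Shannon entropy in bits per symbol:
    # H = sum(p * log2(1/p)), equivalently -sum(p * log2(p))
    n = len(s)
    probs = [count / n for count in Counter(s).values()]
    return sum(p * math.log2(1 / p) for p in probs)

print(shannon_entropy("aaaa"))  # 0.0: one symbol, no uncertainty
print(shannon_entropy("abab"))  # 1.0: two equally likely symbols
print(shannon_entropy("abcd"))  # 2.0: four equally likely symbols

Nothing deeper than that: it counts how many bits per symbol you need
on average, given how often each symbol occurs.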

"At an everyday practical level the links between information entropy
and thermodynamic entropy are not evident. Physicists and chemists are
apt to be more interested in changes in entropy as a system
spontaneously evolves away from its initial conditions, in accordance
with the second law of thermodynamics, rather than an unchanging
probability distribution. And, as the minuteness of Boltzmann's
constant k_B indicates, the changes in S/k_B for even tiny amounts of
substances in chemical and physical processes represent amounts of
entropy which are so large as to be off the scale compared to anything
seen in data compression or signal processing. Furthermore, in
classical thermodynamics the entropy is defined in terms of
macroscopic measurements and makes no reference to any probability
distribution, which is central to the definition of information
entropy.

But, at a multidisciplinary level, connections can be made between
thermodynamic and informational entropy, although it took many years
in the development of the theories of statistical mechanics and
information theory to make the relationship fully apparent. In fact,
in the view of Jaynes (1957), thermodynamic entropy, as explained by
statistical mechanics, should be seen as an application of Shannon's
information theory: the thermodynamic entropy is interpreted as being
proportional to the amount of further Shannon information needed to
define the detailed microscopic state of the system, that remains
uncommunicated by a description solely in terms of the macroscopic
variables of classical thermodynamics, with the constant of
proportionality being just the Boltzmann constant."

The key phrase for me here is "the thermodynamic entropy is
interpreted as being proportional to the amount of further Shannon
information needed to define the detailed microscopic state of the
system". This confirms what I have been saying and is the opposite of
what you are saying. Thermodynamic entropy is proportional to the
amount of Shannon information *needed* to (encode/compress/extract
redundancy) from a given description to arrive at a maximally
compressed description. The more entropy or patternlessness you have,
i.e. the more equilibrium of probability and lack of redundancy you
have, the less information you have and the more Shannon information
you need to avoid lossy compression.

This means that DNA, having low entropy compared with pure water, has
high pattern content, high information, and less Shannon information
is required to describe it. Easier to compress does *not* mean less
information,

You're switching meanings of "information". Something highly compressible, like "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", doesn't convey much information in either the colloquial or Shannon sense. I think it's important to keep in mind that these measures of information are relative to some context. Removed from its cellular environment, the code for a strand of DNA would not convey much information in the colloquial sense, but its Shannon information would be the same.
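
To make "relative to some context" concrete: the colloquial measure is
relative to surroundings like the cellular machinery, while the
Shannon measure is relative to the probability model the receiver
brings to the message. A small Python sketch (both symbol models below
are invented purely for illustration):

import math

def bits(msg, p):
    # self-information of msg under symbol model p: sum of -log2 p(symbol)
    return sum(-math.log2(p[c]) for c in msg)

letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
uniform = {c: 1 / 26 for c in letters}    # receiver expects any letter equally
skewed = {c: 0.01 / 25 for c in letters}  # receiver already expects mostly A's
skewed["A"] = 0.99

msg = "A" * 20
print(bits(msg, uniform))   # ~94 bits
print(bits(msg, skewed))    # ~0.3 bits

Same string, very different Shannon information, because the assumed
source (the context) differs.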


it means more information is present already because in
essence the job is already partially done for you. Shannon entropy,
then, is a measure of drag on compression, a figurative use of the
term entropy for the specific purpose of encoding and decoding. I am
using the literal thermodynamic sense of entropy,

You mean the integrating variable S in TdS=dQ?
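
And to put a scale on that S: converted to Shannon terms via
S/(k_B ln 2), even a mundane entropy change amounts to an astronomical
number of bits, which is the "off the scale" point in the Wikipedia
passage above. A back-of-the-envelope sketch in Python (334 J/g is the
standard figure for the latent heat of fusion of ice):

import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
latent_heat = 334.0    # latent heat of fusion of water, J per gram (approximate)
T_melt = 273.15        # melting point of ice, K

dS = latent_heat / T_melt         # entropy change for melting 1 g of ice, J/K
bits = dS / (k_B * math.log(2))   # the same change expressed as Shannon information

print(dS, bits)   # about 1.2 J/K, i.e. on the order of 1e23 bits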

as well as the
figurative vernacular sense of entropy as degradation of order or
coherence; both of these are loosely inversely proportional to Shannon
entropy.

No; more varied strings, with less internal correlation and a more random look, convey more information.

The compressibility of a novel or picture does not relate to
the quality of information, not to mention qualities of significance.
Weighing art by the pound is not a serious way to approach a theory
about consciousness or qualia.


That's why lossless computer image and sound compression
programs don't work with white noise: there is no redundancy to remove
because white noise has no redundancy.  It would take many more dots and
dashes sent down a wire to describe every pop and click in a piece of white
noise than to describe a symphony of equal length.
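
That is easy to check with any lossless compressor. A rough Python
sketch, with a perfectly periodic byte pattern standing in (very
crudely) for structured sound and os.urandom standing in for the white
noise:

import math, os, zlib

n = 44100  # one second of 8-bit samples, just for scale

noise = os.urandom(n)  # white noise: every byte independent and uniform
period = bytes(int(127 + 120 * math.sin(2 * math.pi * k / 100)) for k in range(100))
tone = period * (n // 100)  # a perfectly periodic "tone" of the same length

print(len(zlib.compress(noise, 9)))  # ~44k: no redundancy, nothing to squeeze out
print(len(zlib.compress(tone, 9)))   # a few hundred bytes: almost all redundancy

The noise actually comes out slightly bigger than it went in once the
container overhead is added; the periodic signal collapses to almost
nothing.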
Yes, I see what you mean. I had not heard of Shannon information. It's
an excellent tool for working with statistical data, but tells us
nothing about what information actually is or does.

It does, so long as you keep the context in mind.

Brent
