On Jan 26, 1:01 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 19, 2012 at 5:28 PM, Craig Weinberg <whatsons...@gmail.com>wrote:
> > I thought that the whole point of information theory is to move beyond
> > quality into pure quantification.
> > > the suggestion that information can be defined as not having anything to
> > > do with the difference between order and the absence of order is laughably
> > > preposterous
> > The idea that a bucket of water has more 'information' than DNA is
> > meaningless.
> What word didn't you understand?
Information. If a bucket of water has more of it than DNA, then the
word information is meaningless.
> > > No, if it's repeating then it would have less information, that is to
> > > say it would take less information to describe the result.
> > Of course, but how does that jibe with the notion that information is
> > molecular entropy? How does A-T A-T A-T or G-T G-T G-T guarantee fewer
> > internal degrees of freedom within a DNA molecule than A-T G-C A-T?
> It would take little information to describe a repeating sequence like
> A-T-A-T-A-T.... and few ways to change its micro-state without altering
> its macro orderly appearance,
Describe it to whom? Macro appearance to what? If you live alone on a
planet that is only liquid, how does one 'describe' a repeating
sequence? Besides your own mind, what would tell you that A-T-A-T-
A-T... can be expressed in any way other than what it literally is?
> so it has a very low entropy, but it would
> take a lot of information to describe a random sequence A-T G-C A-T... and
> lots of ways to alter its micro-state with it still looking random, so it
> has a high entropy.
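For what it's worth, the compressibility point in the quote above can be put in toy form. This is my own sketch, not anything from the thread: it uses zlib compression as a rough stand-in for description length, and the sequence lengths are arbitrary assumptions.

```python
# Toy sketch: a repeating sequence needs far less information to
# describe than a random one, using zlib output size as a crude
# proxy for description length (Kolmogorov complexity).
import random
import zlib

random.seed(0)  # reproducible illustration

repeating = "AT" * 5000                                        # A-T A-T A-T ...
random_seq = "".join(random.choice("ATGC") for _ in range(10000))  # A-T G-C A-T ...

repeating_size = len(zlib.compress(repeating.encode()))
random_size = len(zlib.compress(random_seq.encode()))

# The repeating string collapses to a handful of bytes; the random
# string stays near its raw entropy (about 2 bits per symbol).
print(repeating_size, random_size)
```

On this measure the random sequence always "contains" more information than the patterned one, which is exactly the usage being disputed below.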
So you are saying water has more information than DNA, but DNA that is
completely random has the same amount of information as (or less than) the
DNA that belonged to Beethoven. A symphony then would have less
information and more entropy than random noise. If the word
information is to have any meaning, quantity and compressibility of
data must be distinguished from quality of its interpretation. Which
of course parallels the AI treatment of intelligence (trivial or
quantitative processing capacity) versus cognitive awareness.
> > I see no reason to use the word information at all for this. It sounds
> > like you are just talking about entropy to me.
> As I said, think about entropy as a measure of the number of ways you can
> change the micro-structure of something without changing its large scale
> macro appearance.
I don't think it's a good definition because micro and macro are
relative to an observer, not to the universe, but I understand what
you mean. There really is no definition related to order or pattern
that isn't subjective. The degree to which something's 'large scale
macro appearance' changes is contingent entirely on our ability to
perceive and recognize the changes.
Let's say your definition were true, though. What does it have to do
with information being directly proportionate to entropy? If entropy
were equal or proportionate to information, then you are saying that the
more information something contains, the less it matters. The more
information you have on the micro level, the less you can tell at the
macro. It seems obvious that they are inversely proportional. To
inform something is to reduce its entropy (which necessarily means
increasing entropy somewhere else...entropy is all about space). I
build a sand castle and it has lower entropy than the rest of the
beach. Over time, the sand will return to the beach and we say the
entropy has returned to the higher beach level. If I encase the
sandcastle in lucite, it will slow down that process tremendously
because the form has no space to fall away from the castle.
> > If I have red legos and white legos, and I build two opposite monochrome
> > houses and one of mixed blocks, how in the world does that affect the
> > entropy of the plastic bricks in any way?
> It does not affect the entropy of the plastic bricks but it does change the
> entropy of the structures built with those plastic bricks. For a single
> part in isolation entropy is not defined, a single water molecule has no
> entropy but a trillion trillion of them in a drop of water does.
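As I read it, the counting definition in the quote above can be sketched with the Lego bricks themselves. This is my own toy illustration, not Clark's; the function name and the convention k_B = 1 are my assumptions.

```python
# Toy sketch: entropy as the log-count of micro-arrangements that
# share one macro description. The "macrostate" is just how many red
# bricks sit in a row of n_bricks; the "microstate" is which exact
# positions they occupy.
import math

def boltzmann_entropy(n_bricks, n_red):
    # W = number of distinct arrangements with the same macro count
    microstates = math.comb(n_bricks, n_red)
    return math.log(microstates)  # S = ln W, with k_B set to 1

print(boltzmann_entropy(1, 1))    # a single brick: one arrangement, S = 0
print(boltzmann_entropy(100, 0))  # all-red "monochrome house": S = 0
print(boltzmann_entropy(100, 50)) # mixed bricks: many arrangements, S large
```

On this sketch a single part really does get zero entropy, and the monochrome house gets less than the mixed one, which is the claim I question next.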
Ok, so how does it affect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not? That would seem to preclude information itself from
having any objective material presence.
You received this message because you are subscribed to the Google Groups
"Everything List" group.