On Jan 28, 1:48 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Fri, Jan 27, 2012 at 5:51 PM, Craig Weinberg <whatsons...@gmail.com>wrote:
> > You could much more easily write a probabilistic equation to simulate any
> > given volume of water than the same volume of DNA, especially
> The motion of both can be well described by the Navier-Stokes equations, which
> describe fluid flow using Newton's laws, and DNA being more viscous than
> water, the resulting equations would be simpler than the ones for water.
I'm not talking about fluid flow, I'm talking about simulating
everything - potential and actual chemical reactions, etc. Water can
be described by multiplying the known interactions of H2O, DNA would
need many more variables.
> > > when you get into secondary and tertiary structure.
> You've got to play fair: if you talk about microstates for DNA, I get to
> talk about microstates for water.
> > I had not heard of Shannon information.
> Somehow I'm not surprised, and it's Shannon Information Theory.
No, I've heard of Shannon Information Theory. I didn't realize it was
such an instrumental special-case theory, though.
> > The key phrase for me here is "the thermodynamic entropy is interpreted as
> > being proportional to the amount of further Shannon information needed to
> > define the detailed microscopic state of the
> > system".
> OK, although I don't see what purpose the word "further" serves in the
> above, and although I know all about Claude Shannon the term "Shannon
> information" is nonstandard. What would Non-Shannon information be?
Non-Shannon information would be anything that is not directly
involved in the compression of a digitally sampled description into
another digital description. 'Further' means that if you add x calories
of heat, you need x more units of Shannon information to define the
effect of the added heat/motion.
> > > This confirms what I have been saying and is the opposite of what you
> > are saying.
> What on Earth are you talking about?? The more entropy a system has the
> more information needed to describe it.
Yes. Pattern is what lets you describe a system more easily. The more
pattern there is, the more you can say 'yes, I get it: add 500 0s and
then another 1'. When there is less pattern and more energy, it takes
more Shannon information to describe the system, because there are no
patterns to give your compression a shortcut.
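The '500 0s and then another 1' point can be checked directly with an off-the-shelf compressor; a minimal sketch (the byte strings are just illustrative stand-ins):

```python
import os
import zlib

# Highly patterned data: 500 zeros followed by a one.
patterned = b"0" * 500 + b"1"
# Pattern-free data of the same length: random bytes.
noise = os.urandom(501)

print(len(zlib.compress(patterned)))  # a handful of bytes
print(len(zlib.compress(noise)))      # roughly the original 501 bytes
```

The patterned message collapses to a short description, while the noise resists compression: the compressor has no shortcut to exploit.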
> > > This means that DNA, having low entropy compared with pure water, has
> > high pattern content, high information, and less Shannon information.
> I see, it has high information and less information. No I take that back, I
> don't see, although it is consistent with your usual logical standards.
Shannon information is not information in general; it is a specific
kind of information about information, and it is really inversely
proportional to information in any other sense. Its uninformability is
what it is: drag, entropy, resistance to the process.
> > Easier to compress does *not* mean less information
> It means a message has been inflated with useless gas and a compression
> program can remove that gas and recover the small kernel of information
Hahaha. The useless gas is what separates coherence and sanity from
garbage. It's useless to a computer, sure, but without the gas it's
useless to us. Next time you want to look at a picture, try viewing it
in its compressed form in a hex editor. Get rid of all that useless
gas and see how much of the picture is left for you.
> White noise has no gas in it for a compression program to
> deflate; that's why, if you don't know the specific compression program used,
> the resulting file (like a zip or gif file) would look like random white
> noise, and yet it's full of useful information if you know how to get it.
> The same thing is true of encrypted files: if the encryption is good, then
> the file will look completely random, just white noise, to anyone who does
> not have the secret key.
I understand what you mean completely, and that is indeed how
computers treat data, but it is the opposite of what it means to
inform in general terms. Compression and encryption are deformations;
decompression and decryption are how we get any information back out.
White noise is called noise for a reason. The opposite of noise is
signal. Signals are signifying and informing, thus information.
> > The compressibility of a novel or picture does not relate to the quality
> > of information
> How do you expect mathematics to deal with anything as subjective as
> quality? A novel that's high quality to you may be junk to me.
I don't expect mathematics to deal with it. I expect a theory of
everything to deal with it.
> > Knowledge and wisdom are already owned by philosophy and religion,
> I've never heard of religion saying anything wise, philosophy does contain
> wisdom but none of it came from professional philosophers, at least not in
> the last 300 years.
I'm not a big philosophy or religion fan myself, but Wittgenstein,
Heidegger, Sartre, Foucault, and Kierkegaard were recent and had some
impressive things to say. But my point was that those terms are
associated too much with those traditions to be used meaningfully in a
new universal synthesis.
> > The human mind does not work like a computer
> As you've said before, but saying it does not make it so.
No, but thinking about what it means does make it so. Here are some
sample articles on the subject:
> > it does not compress and decode memories
> Then the human mind works very inefficiently and needs improvement.
Why, so we can fill it up with petabytes of tooth-brushing memories?
> > are concrete analog presentations that re-present, *not* representations
> > and not digital data.
> Even asterisks do not make it so.
It already is made so, I'm just pointing it out. It's not a matter of
opinion. Blue is a presentation, not some generic quantitative
representation of non-blueness.
> > I don't think it's an exaggeration to say that 99% of people who use the
> > word information use it the way that I've been using it
> I don't think it's an exaggeration to say that if you wish to understand
> how mind works the verbiage generated by 99% of the people on this planet
> will be of no help to you whatsoever; better to listen to what the experts
> have to say about the subject.
"Science begins when you distrust experts." - Richard Feynman
You're right, I'll trust Feynman.
> > The bucket of water has higher thermodynamic entropy which requires more
> > Shannon information to describe.
> > > The encoded description of the water has more information if we were to
> > simulate it exactly
> > but that doesn't mean the original has more information,
> I see, it has more information but that doesn't mean it has more
> information. No I take that back, I don't see, although it is consistent
> with your usual logical standards.
Repeating yourself there. I hope you realize now that information is
not at all the same thing as Shannon information, and that there are
many uses of the word entropy. It's a fundamental concept, like
'degeneration', so it gets used in a lot of different ways. If you want
to talk about how a physical object degenerates you might talk about
thermodynamic entropy. If you want to talk about how a crowd disperses
after a football game you might talk about entropy in a geographic
statistical sense. The two are not related. You can't make the crowd
disperse earlier by warming the stadium up.
You received this message because you are subscribed to the Google Groups
"Everything List" group.