On 24.01.2012 13:49 Craig Weinberg said the following:

Sorry, I thought you were saying that they are directly proportional
measures (Brent and Evgenii seem to be talking about it that way). If
you are instead saying that they are inversely proportional then I
would agree in general - information can be considered negentropy. I

I am not an expert in informational entropy. For me it does not matter how it is defined in information theory, whether as entropy or negentropy. My point is that it has nothing to do with thermodynamic entropy (see my previous message with the four cases for the string "10").
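The model-relativity of informational entropy can be sketched in a few lines. The four cases referenced above come from an earlier message not quoted here, so the probability distributions below are hypothetical stand-ins; the point is only that the same string "10" carries a different Shannon entropy depending on the assumed source model, independently of any thermodynamic quantity.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical source models for the symbols of the string "10":
fair_coin = [0.5, 0.5]   # '1' and '0' equally likely
biased    = [0.9, 0.1]   # the source strongly favours one symbol
certain   = [1.0]        # the output is known in advance

print(shannon_entropy(fair_coin))  # 1.0 bit per symbol
print(shannon_entropy(biased))     # ~0.469 bits per symbol
print(shannon_entropy(certain))    # 0.0 bits per symbol
```

The string itself never changes; only the assumed distribution does, which is why a measure defined this way need not track the thermodynamic entropy of any physical system that happens to encode the string.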


think that we can go further in understanding information, though.
Negentropy is a good beginning, but it does not address significance.
The degree to which information has the capacity to inform is even
more important than the energy cost to generate it. Significance of
information is a subjective quality which is independent of entropy
but essential to the purpose of information. In fact, information
itself could be considered the quantitative shadow of the quality of
significance. Information that does not inform something is not
information.

