At 09:01 PM 12/10/2007, Bob Logan wrote:
Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. How, then, can entropy and Shannon
entropy be compared, let alone connected?

Bob,

Temperature has the dimensions of energy per degree of freedom.
Do the dimensional analysis, and you end up with a measure in
degrees of freedom. This is a very reasonable dimensionality
for information.
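To spell out the dimensional analysis (a sketch of the standard
argument, using Boltzmann's formula for W equiprobable microstates):

    S = k_B \ln W,        [S] = \mathrm{J\,K^{-1}} = [k_B]

so

    S / k_B = \ln W

is a pure number in natural log units (nats), while Shannon's

    H = -\sum_i p_i \log_2 p_i

counts the same thing in bits in the uniform case (H = \log_2 W),
giving

    S = (k_B \ln 2)\, H.

On this reading Boltzmann's constant is just the conversion factor
between joules per kelvin and counts of degrees of freedom.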

I am talking about information, not entropy - an organized collection
of organic chemicals must have more meaningful information than an
unorganized collection of the same chemicals.

I am planning to make some general comments on meaning, but
I am too busy right now, so they will have to wait. There are
some very tricky issues involved, but I will say right now that
information is not meaningful in itself; it has only a potential for
meaning. All information must be interpreted to be meaningful, and
the same information can have very different meanings depending on
how it is interpreted. Information, on the other hand, has an
objective measure independent of interpretation, one that depends
on the measure of asymmetry within a system. For details, see the
recent book by my student Scott Muller, Asymmetry: The Foundation
of Information (Springer, 2007):

http://www.amazon.ca/Asymmetry-Foundation-Information-Scott-Muller/dp/3540698833
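As a toy illustration of the asymmetry idea (my own minimal sketch in
Python, not a reproduction of Muller's group-theoretic formalism):
act on a configuration with a symmetry group, here cyclic rotations
of a string, and take the log of the orbit size. A repetitive, highly
symmetric arrangement has a small orbit; a disordered arrangement of
the very same symbols has a large one.

from math import log2

def orbit_information(s: str) -> float:
    # Toy asymmetry measure: log2 of the number of distinct
    # cyclic rotations of s. A perfectly repetitive string has
    # a small orbit (high symmetry); a string with no rotational
    # symmetry has orbit size len(s).
    rotations = {s[i:] + s[:i] for i in range(len(s))}
    return log2(len(rotations))

# The same multiset of symbols (four A's, four B's), organized
# versus disorganized:
print(orbit_information("ABABABAB"))  # 1.0 bit: only 2 distinct rotations
print(orbit_information("AABBABBA"))  # 3.0 bits: all 8 rotations distinct

Note that the two arrangements of the same components score
differently on this crude measure, which is exactly the sort of
interpretation-independent difference at issue; whether the extra
bits are meaningful is a separate question of interpretation.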

This whole discussion of meaning needs far more precision and a
lot of garbage collecting.

Cheers,
John

----------
Professor John Collier                                     [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292       F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html