Dear Loet,

Shannon "information" is indeed counter-intuitive, and we have John von
Neumann's joke to thank for much of the confusion. Shannon asked von
Neumann what to name his formula. von Neumann told him to name it
"entropy", because his metric is "formally identical to Boltzmann's
probabilistic measure of entropy, and because no one really knows what
entropy is, so you'll be ahead in any argument!" Whence the conflation of
entropy with Shannon's measure of diversity.

"Meaningful information" is calculated in Bayesian fashion by comparing
the *differences* between the a priori and a posteriori distributions (or
sender and receiver in the *subdiscipline* of communication). It is called
the "average mutual information" (AMI), which serves as a form of
"proto-meaning" -- detractors of the Shannon approach notwithstanding.
You, in fact, have published numerous fine papers centered on AMI.
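As a toy numerical sketch of the AMI (the joint distribution below is
invented purely for illustration), one compares the joint distribution
against the product of its marginals:

```python
from math import log2

# Invented joint distribution p(x, y): rows = sender symbols (a priori),
# columns = receiver symbols (a posteriori). Any normalized table would do.
p_xy = [[0.30, 0.05],
        [0.10, 0.55]]

p_x = [sum(row) for row in p_xy]        # sender (a priori) marginal
p_y = [sum(col) for col in zip(*p_xy)]  # receiver (a posteriori) marginal

# Average mutual information, in bits:
#   AMI = sum over x,y of p(x,y) * log2[ p(x,y) / (p(x)*p(y)) ]
ami = sum(p * log2(p / (p_x[i] * p_y[j]))
          for i, row in enumerate(p_xy)
          for j, p in enumerate(row)
          if p > 0)

print(f"AMI = {ami:.4f} bits")
```

The AMI is zero when sender and receiver are statistically independent
and grows as the a posteriori distribution departs from the a priori one.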

The residual between entropy and AMI is called the "conditional entropy".
This warns us to be careful concerning entropy and information: the
probabilistic entropy of both Boltzmann and Shannon doesn't quite
characterize true entropy. It encompasses *both* didactic constraint
(AMI) and its absence (an apophasis). (Entropy, strictly speaking, is an
apophasis, which is why we have such trouble wrapping our minds around
the concept!) Thermodynamic entropy, measured by heat and temperature,
comes closer to grasping the true nature of the apophasis, but even there
ambiguity remained, and it became necessary to postulate the third law of
thermodynamics (the entropy of a perfect crystal at absolute zero is
zero). (N.b., physicists "define" entropy as the Boltzmann formula in a
vain effort to "sanitize" thermodynamics. The messy engineering roots of
thermodynamics have always been an irritant for physicists, going all the
way back to the time of Carnot.)
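The residual relation can be checked numerically. In this sketch (again
with an invented joint distribution), the receiver's Shannon entropy
splits exactly into the AMI plus the conditional entropy:

```python
from math import log2

# Invented joint distribution p(x, y), for illustration only.
p_xy = [[0.30, 0.05],
        [0.10, 0.55]]
p_x = [sum(row) for row in p_xy]
p_y = [sum(col) for col in zip(*p_xy)]

def entropy_bits(ps):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(p * log2(p) for p in ps if p > 0)

h_y = entropy_bits(p_y)  # total entropy on the receiver side

ami = sum(p * log2(p / (p_x[i] * p_y[j]))
          for i, row in enumerate(p_xy) for j, p in enumerate(row) if p > 0)

# Conditional entropy H(Y|X), computed directly from p(y|x) = p(x,y)/p(x):
h_cond = -sum(p * log2(p / p_x[i])
              for i, row in enumerate(p_xy) for j, p in enumerate(row) if p > 0)

# Entropy decomposes into constraint (AMI) plus residual indeterminacy:
assert abs(h_y - (ami + h_cond)) < 1e-9
print(f"H(Y) = {h_y:.4f}  =  AMI {ami:.4f}  +  H(Y|X) {h_cond:.4f}")
```

The conditional-entropy term is the "absence of constraint" discussed
above: it is what the probabilistic formula lumps together with the AMI.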

And so we need to be careful when approaching the concepts of information
and entropy. We need to think first before dismissing the Shannon approach
as having no relation to meaning (albeit meaning in a primitive fashion),
and we need always to keep in mind that *both* concepts are relative in
nature -- never absolute!

My best to all,
Bob U.

On Sat, Nov 18, 2017 at 3:18 AM, Loet Leydesdorff <> wrote:

> Dear Terry and colleagues,
> I agree that one should not confuse communication with the substance of
> communication (e.g., life in bio-semiotics). It seems useful to me to
> distinguish between several concepts of "communication".
> 1. Shannon's (1948) definitions in "The Mathematical Theory of
> Communication". Information is communicated, but is as yet meaning-free.
> These notions of information and communication are counter-intuitive
> (Weaver, 1949). However, they provide us with means for measurement,
> such as bits of information. The meaning of the communication is
> provided by the system of reference (Theil, 1972); in other words, by
> the specification of "what is communicated?" For example, if money is
> communicated (redistributed), the system of reference is a transaction
> system. If molecules are communicated, life can be generated (Maturana).
> 2. Information as "a difference which makes a difference" (Bateson, 1973;
> MacKay, 1969). A difference can only make a difference for a receiving
> system that provides meaning to the system. In my opinion, one should in
> this case talk about "meaningful information" and "meaningful
> communication" as different from the Shannon-type information (based on
> probability distributions). In this case, we don't have a clear instrument
> for the measurement. For this reason, I have a preference for the
> definitions under 1.
> 3. Interhuman communication is of a different order because it involves
> intentionality and language. The discourses under 1. and 2. are interhuman
> communication systems. (One has to distinguish levels and should not impose
> our intuitive notion of communication on the processes under study.) In my
> opinion, interhuman communication involves both communication of
> information and possibilities of sharing meaning.
> The Shannon-type information shares with physics the notion of entropy.
> However, physical entropy is dimensioned (joules/kelvin; S = k_B H),
> whereas probabilistic entropy is dimensionless (H). Classical physics, for
> example, is based on the communication of momenta and energy because these
> two quantities have to be conserved. In the 17th century, it was common to
> use the word "communication" in this context (Leibniz).
> Best,
> Loet
Fis mailing list
