I think complaining about Shannon entropy as a measure of information is
completely justified: it is steam-engine physics, unfortunately still
widely used despite its many flaws and limitations.
But to think that Shannon entropy is at the front-end in the mathematical
discussion of
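[For readers less familiar with the quantity being debated above, a minimal sketch of Shannon's measure, H = -Σ p·log2(p), in a few lines of Python. This is an editorial illustration, not part of the original exchange.]

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Outcomes with zero probability contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less, whatever the outcomes "mean":
print(shannon_entropy([0.9, 0.1]))
```

Note that the measure depends only on the probability distribution, never on meaning, context, or purpose, which is precisely the limitation the discussants return to below.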
Dear Loet,
The way you have asked it, I think the answer to your question is known: both
order and disorder are universals, linked dialectically. Never one without the
other, as for symmetry and asymmetry, except in trivially simple cases.
Cheers,
Joseph
Dear Pedro and colleagues,
The figure from Weaver in Loet's excellent posting leaves out a few aspects.
The why, the what, the how long, the with whom, and other aspects
of the information phenomenon do not enter. In doing so we have
streamlined the phenomenon... and have left it ready
In an offline exchange, Michel asks some questions (on my reply to
Annette), summarized below.
===
> Your "material variation" seems identical to "spatial
> structure" which is classically used in informational
> ecology. Why not?
• "Why not?" what? I am unsure of what you are asking. In
Dear Marcus, Loet, Bob... and All,
Again very briefly, your exchanges make clear the limits of the received
Shannonian approach and the (narrow?) corridors left for advancement. I
find this situation highly reminiscent of what happened with Mechanics
long ago: an excellent theory (but of