Dear Hector, 


I am very impressed by your letter to Joe Brenner. You are absolutely right
when you say that Shannon's information is obsolete and of limited use. But
you are wrong to say that it "should not even be mentioned any longer in
serious discussions about information". As far as I understand the matter,
Shannon information is a valuable component of the complex notion of
information, which comprises two parts: physical and semantic information.
Therefore, Shannon information cannot be dismissed from the notion of
information, just as other forms of physical information cannot: Fisher
information, algorithmic information, Kolmogorov complexity, and the like.
On the other hand, semantic information is neither a derivative of nor an
extension of physical information.


FIS people know nothing about this (about physical and semantic information
and their coexistence), and they do not even want to know. Therefore, I will
discontinue my further comments on this subject (and on your letter).


On the agenda of our current discussion is Marcus Abundis's idea of "A
Priori Modeling", and that is what we should be speaking about. You may like
it or not, you may accept it or argue against it, but you cannot amend it,
improve it, or come up with your own much better version of it (Loet,
Krassimir, Bob, Stan and others).


So, let us comply with the rules.


Best regards,




From: Fis [] On Behalf Of Hector Zenil
Sent: Wednesday, June 29, 2016 4:37 PM
To: joe.brenner
Cc: fis
Subject: Re: [Fis] Shannonian Mechanics?


I think complaining about Shannon entropy as a measure of information is
completely justified: it is steam-engine physics, unfortunately still
widely used despite its many flaws and limitations.


But to think that Shannon entropy is at the forefront of the mathematical
discussion of information is a mistake, and this group, and others, have
perpetually been trapped in a 1960s and '70s discussion of an antiquated
theory of information that not even Shannon himself thought was worth using
for anything meaningful about information beyond measuring communication.


Indeed, Shannon entropy is nothing but a counting function over
states/symbols; at best it is a measure of diversity, a bound on
information transfer. The technical and philosophical discussion, here and
everywhere else, should be (and has been, among those at the scientific
frontier) focused on what has been done in the last 50 years to leave
Shannon entropy behind. But nobody here (and almost nowhere else) discusses
algorithmic randomness, Levin's universal distribution, measures of
sophistication, and so on; people prefer to remain in a continuous state of
pre-1960s Shannon entropy.
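(To make the "counting function" point concrete, here is a minimal Python sketch of my own, not from the original thread: Shannon entropy of a string depends only on the symbol frequencies, so a highly structured string and a scrambled permutation of it get exactly the same score, whereas an algorithmic measure would distinguish them.)

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy in bits per symbol, computed from symbol counts alone."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Order is invisible to entropy: both strings have 8 zeros and 8 ones,
# so both score 1.0 bit/symbol, though one is trivially compressible.
structured = "01" * 8              # "0101010101010101"
scrambled  = "0011001101011010"    # same symbol counts, shuffled order
print(shannon_entropy(structured))  # 1.0
print(shannon_entropy(scrambled))   # 1.0
```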


Shannon entropy should not even be mentioned any longer in serious
discussions about information; we moved on a long time ago (unfortunately,
not even many physicists have).


Trying to be constructive. All best,


- Hector 


Fis mailing list
