Dear Loet, Dear All,

Thank you, Loet, for this very clear expression of the "state-of-the-art" with respect to a dynamic design or measure for meaningful information as associated with negentropy. Today, I would view this position as the analytical skeleton of the real three-level dynamics of information-as-process described by Deacon in his new book.
As I understand his view, one can only capture the origin of meaningful information by reference to the detailed dynamics (thermodynamic, biological, and teleodynamic) that are involved in real cognitive processes. At several points, in discussing the role of constraints as /absences/ that determine emergent phenomena, Deacon says that understanding this concept requires a "figure/background" reversal in our thinking, a new metaphysical sophistication, and a willingness to "intertwine perspectives". It also means openness to rigorous thinking that must manage without proofs in the standard sense, the failure of which should be obvious by now. Such an approach does, however, take a lot more work and study. I look forward to seeing how best to accomplish this in future notes, not only with respect to Deacon (and Brenner) but generally.

Best regards,
Joseph

----- Original Message -----
From: Loet Leydesdorff
To: 'Pedro C. Marijuan'; fis@listas.unizar.es
Sent: Tuesday, October 25, 2011 8:53 AM
Subject: Re: [Fis] [Fwd: Re: FW: Meaning Information Theory]

---From Gavin

Dear Gavin,

The notion of meaningful information is associated with negentropy. Information can only become meaningful within a system (e.g., an observer) which/who provides it with meaning. One can also consider it as observed information, as distinct from Shannon's information, which remains expected information. I agree that this is confusing. The origins are to be found in Brillouin (for the formalism) and in Bateson, who defined information as "a difference which makes a difference". Shannon-type information can be considered as only differences (in a probability distribution). These first-order differences can only make a difference for a system of reference. The specification of the system of reference provides the information with dimensionality and thus with meaning.

The above is a static design.
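[Editorial aside: Loet's distinction between expected and observed information can be illustrated with a minimal sketch. The distribution below is invented for illustration; the point is only that Shannon's H is the expectation of the surprisal over all possible outcomes, while an observer registers the surprisal of the one outcome that actually occurred.]

```python
from math import log2

# A hypothetical probability distribution over three outcomes.
p = {"a": 0.5, "b": 0.25, "c": 0.25}

# Expected information (Shannon's H): the average surprisal -log2 p(x)
# taken over the whole distribution, before any outcome is observed.
H = -sum(q * log2(q) for q in p.values())

# Observed information for one realized outcome: the surprisal of the
# particular event "b" that the observing system registered.
observed = -log2(p["b"])

print(H, observed)  # 1.5 2.0
```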
In the dynamic design (a la Brillouin), one can use the Kullback-Leibler divergence to compare the a posteriori state with the a priori one. However, this divergence (I) is necessarily positive (see Theil, 1972, for the proof); it is Shannon-type information. The only measure of negative information (negentropy) is the mutual information in three or more dimensions. Krippendorff (2009) showed that this signed information measure (Yeung, 2008) can be considered as the difference between the redundancy generated within the system and the Shannon-type information generated in the interactions. One can compute with this measure (e.g., Leydesdorff, 2010). It operationalizes "the difference which makes a difference": the result can be an increase or a decrease of the uncertainty that prevails in the observing system.

References:

Bateson, G. (1972). Steps to an Ecology of Mind. New York: Ballantine.
Brillouin, L. (1962). Science and Information Theory. New York: Academic Press.
Krippendorff, K. (2009). Information of Interactions in Complex Systems. International Journal of General Systems, 38(6), 669-680.
Leydesdorff, L. (2010). Redundancy in Systems which Entertain a Model of Themselves: Interaction Information and the Self-organization of Anticipation. Entropy, 12(1), 63-79; doi:10.3390/e12010063.
Theil, H. (1972). Statistical Decomposition Analysis. Amsterdam/London: North-Holland.
Yeung, R. W. (2008). Information Theory and Network Coding. New York, NY: Springer.

--------------------------------------------------------------------------------
Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20-525 6598; fax: +31-842239111
l...@leydesdorff.net; http://www.leydesdorff.net/
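[Editorial aside: the two measures in Loet's message can be checked with a short sketch, not taken from the email itself; the distributions are invented for illustration. The Kullback-Leibler divergence between an a posteriori and an a priori distribution is always non-negative, whereas the mutual information in three dimensions, under the sign convention I(X;Y;Z) = I(X;Y) - I(X;Y|Z), is signed. The XOR example below (z = x XOR y) is the standard case where it comes out negative.]

```python
from itertools import product
from math import log2

# Kullback-Leibler divergence (the dynamic design, a la Brillouin):
# compares the a posteriori state with the a priori one; always >= 0.
def kl(post, prior):
    return sum(q * log2(q / prior[k]) for k, q in post.items() if q > 0)

print(kl({0: 0.7, 1: 0.3}, {0: 0.5, 1: 0.5}))  # positive: Shannon-type information

# Joint distribution over (x, y, z) with x, y fair coins and z = x XOR y.
p = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(joint, axes):
    """Marginal distribution over the given coordinate positions."""
    m = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in axes)
        m[key] = m.get(key, 0.0) + prob
    return m

def entropy(dist):
    return -sum(q * log2(q) for q in dist.values() if q > 0)

def H(*axes):
    return entropy(marginal(p, axes))

# Mutual information in three dimensions via the entropy expansion:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(XY) - H(XZ) - H(YZ) + H(XYZ)
interaction = H(0) + H(1) + H(2) - H(0, 1) - H(0, 2) - H(1, 2) + entropy(p)
print(interaction)  # -1.0 bit: the signed measure can indeed be negative
```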
_______________________________________________
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis