Dear Loet,
You nicely illustrate the problem as a “hole” in the center of the various 
perspectives. All these current and future perspectives are indeed needed, but 
it is true that “a general theory of information” remains terribly 
challenging, precisely because the different theories adopt sometimes 
orthogonal perspectives, as you say.
Now, perhaps the “hole” can serve as an image leading us far back in time, 
when our universe was only about matter and energy. The evolution of our 
universe could then be used as a reference frame for the history of information.
Such a time-guided background can serve all the various perspectives and 
also highlights pitfalls like the mysterious natures of life and the human mind.
This brings us to take life as a starting point for the existence of meaningful 
information (as said, information should not be separated from meaning. Weaver 
rightly recommended not to confuse meaning with information; it is not about 
separating them).
So we could begin by positioning our investigations between life and the human 
mind to address the natures of information and meaning, which are realities at 
that level and can be modeled there in quite simple terms.
Then, being careful with the human mind, we could move on to human management 
of information and consider human achievements and current works: the 
measurement of quantity (channel capacity, Shannon) and the formalizations 
(physical, referential, normative, syntactic, semantic, pragmatic, 
constraint-satisfaction oriented, your communication/sharing of meaning or 
information, ...).
This does not really fill the “hole”, but it brings in evolution as a thread 
which leads us to start with the simplest task.
Wishing you and all FISers the best for this year's end and for the coming 2017.

From: Fis <> on behalf of Loet Leydesdorff 
Sent: Monday, 26 December 2016 14:01
To: 'Terrence W. DEACON'; 'Francesco Rizzo'; 'fis'
Subject: Re: [Fis] What is information? and What is life?

In this respect Loet comments:

"In my opinion, the status of Shannon’s mathematical theory of information is 
different  from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."

We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon.

Dear Terrence and colleagues,

The inversion is fine with me as an exploration. But I don’t think that this 
can be done on programmatic grounds, because it assumes the possibility of “a 
general theory of information”. I don’t think that such a theory exists or is 
even possible without assumptions that beg the question.

In other words, we have a “hole” in the center. Each perspective can claim its 
“generality” or fundamental character. For example, many of us entertain a 
biological a priori; others (including you?) reason on the basis of physics. 
The various (special) theories, however, are not simply juxtaposed, but can be 
considered as different (sometimes orthogonal) perspectives. Translations are 
possible at the bottom by unpacking in natural language, or sometimes more 
formally (and perhaps more productively) using Shannon’s information theory and 
formalizations derived from it.

I admit my own communication-theoretical a priori. I am interested in the 
communication of knowledge as different from the communication of information. 
Discursive knowledge specifies and codifies meaning. The communication/sharing 
of meaning provides an in-between layer, which also has to be distinguished 
from the communication of information. Meaning is not relational but 
positional; it cannot be communicated, but it can be shared. I am currently 
working (with coauthors) on a full paper on the subject. The following is the 
provisional abstract:

As against a monadic reduction of knowledge and meaning to signal processing 
among neurons, we distinguish among information processing, meaning 
processing, and the possible codification of specific meanings as discursive 
knowledge. Whereas Shannon-type information is coupled to the second law of 
thermodynamics, redundancy (that is, the complement of information to the 
maximum entropy) can be extended by further distinctions and the specification 
of expectations when new options are made feasible. With the opposite sign, 
the dynamics of knowledge production thus infuses the historical (e.g., 
institutional) dynamics with a cultural evolution. Meaning is provided from 
the perspective of hindsight as feedback on the entropy flow. The circling 
among these dynamics in feedback and feedforward loops can be evaluated by the 
sign of mutual information. When mutual redundancy prevails, the resulting 
sign is negative, indicating that more options are made available and 
innovation can be expected to flourish. The relation of this cultural 
evolution with the computation of anticipatory systems can be specified; but 
the resulting puzzles are a subject for future research.
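As an aside for readers unfamiliar with the sign claim in the abstract: the 
redundancy R = H_max − H and the ternary mutual information 
I(x;y;z) = H1 + H2 + H3 − H12 − H13 − H23 + H123 can be computed directly. 
The following is a minimal numeric sketch; the toy distributions below are my 
own illustrations, not examples from the paper:

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(joint):
    """Shannon entropy (bits) of a distribution given as {outcome: p}."""
    return -sum(p * log2(p) for p in joint.values() if p > 0)

def marginal(joint, dims):
    """Marginalize a joint distribution onto the given tuple positions."""
    out = Counter()
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in dims)] += p
    return out

def ternary_information(joint):
    """I(x;y;z) = H1 + H2 + H3 - H12 - H13 - H23 + H123 (bits)."""
    H = lambda dims: entropy(marginal(joint, dims))
    return (H((0,)) + H((1,)) + H((2,))
            - H((0, 1)) - H((0, 2)) - H((1, 2))
            + entropy(joint))

# Redundancy as the complement of information to the maximum entropy:
# R = H_max - H, with H_max = log2(number of options).
dist = {"a": 0.5, "b": 0.25, "c": 0.25}
H = entropy(dist)                      # 1.5 bits
H_max = log2(len(dist))                # log2(3), about 1.585 bits
R = H_max - H                          # about 0.085 bits of redundancy

# Toy case with negative sign: x, y are independent fair coins, z = x XOR y.
# The three variables are pairwise independent but jointly determined.
xor = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
print(ternary_information(xor))        # -1.0 bit

# Toy case with positive sign: three perfectly correlated copies.
copies = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(ternary_information(copies))     # 1.0 bit
```

The negative value in the XOR case corresponds to the configuration where, in 
the abstract's terms, mutual redundancy prevails.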

Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR) <>;
Associate Faculty, SPRU, <> University of Sussex;

Guest Professor Zhejiang Univ.<>, Hangzhou; 
Visiting Professor, ISTIC, <> Beijing;

Visiting Professor, Birkbeck<>, University of London;

Fis mailing list
