In this respect Loet comments:

 

"In my opinion, the status of Shannon’s mathematical theory of information is 
different  from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."

 

We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon. 

 

Dear Terrence and colleagues, 

 

The inversion is fine with me as an exploration, but I don’t think it can be done on programmatic grounds, because it assumes the possibility of “a general theory of information”. I don’t think such a theory exists, or is even possible without assumptions that beg the question. 

 

In other words, we have a “hole” in the center. Each perspective can claim “generality” or a fundamental character. For example, many of us entertain a biological a priori; others (including you?) reason on the basis of physics. The various (special) theories, however, are not simply juxtaposed; they can be considered as different (sometimes orthogonal) perspectives. Translations among them are possible at the bottom, by unpacking in ordinary language or, sometimes more formally (and perhaps more productively), by using Shannon’s information theory and the formalizations derived from it.

 

I admit my own communication-theoretical a priori. I am interested in the communication of knowledge as distinct from the communication of information. Discursive knowledge specifies and codifies meaning. The communication/sharing of meaning provides an in-between layer, which also has to be distinguished from the communication of information. Meaning is not relational but positional; it cannot be communicated, but it can be shared. I am currently working (with coauthors) on a full paper on the subject. The following is the provisional abstract: 

As against a monadic reduction of knowledge and meaning to signal processing among neurons, we distinguish between information processing, meaning processing, and the possible codification of specific meanings as discursive knowledge. Whereas Shannon-type information is coupled to the second law of thermodynamics, redundancy (that is, the complement of the information to the maximum entropy) can be extended by further distinctions and by the specification of expectations when new options are made feasible. With the opposite sign, the dynamics of knowledge production thus infuses the historical (e.g., institutional) dynamics with a cultural evolution. Meaning is provided from the perspective of hindsight, as feedback on the entropy flow. The circling among dynamics in feedback and feedforward loops can be evaluated by the sign of mutual information. When mutual redundancy prevails, the resulting sign is negative, indicating that more options are made available and that innovation can be expected to flourish. The relation of this cultural evolution to the computation of anticipatory systems can be specified, but the resulting puzzles are a subject for future research.
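The quantities the abstract relies on (Shannon entropy, redundancy as the complement of the information to the maximum entropy, and a mutual information whose sign can become negative in three or more dimensions) can be illustrated numerically. The following is only a minimal sketch on invented toy data; the variables `x`, `y`, `z`, and `w` are hypothetical and not taken from the paper:

```python
from collections import Counter
from math import log2

def entropy(events):
    """Shannon entropy H = -sum_i p_i * log2(p_i) over the observed events."""
    n = len(events)
    return -sum((c / n) * log2(c / n) for c in Counter(events).values())

# Redundancy: the complement of the information to the maximum entropy.
w = [0, 0, 0, 0, 0, 0, 1, 1]       # a skewed binary variable (toy data)
h_max = log2(len(set(w)))          # maximum entropy for two categories: 1 bit
redundancy = h_max - entropy(w)    # > 0: options left unused by the distribution

# Hypothetical categorical observations of three variables (toy data).
x = [0, 0, 1, 1, 0, 1, 0, 1]
y = [0, 1, 0, 1, 0, 1, 1, 0]
z = [0, 1, 1, 0, 0, 1, 1, 0]

hx, hy, hz = entropy(x), entropy(y), entropy(z)
hxy = entropy(list(zip(x, y)))
hxz = entropy(list(zip(x, z)))
hyz = entropy(list(zip(y, z)))
hxyz = entropy(list(zip(x, y, z)))

# Mutual information in three dimensions (interaction information).
# Unlike the two-dimensional case, it can be negative; a negative sign
# is what the abstract refers to as mutual redundancy prevailing.
t_xyz = hx + hy + hz - hxy - hxz - hyz + hxyz

print(f"redundancy = {redundancy:.3f} bits, T(x,y,z) = {t_xyz:.3f} bits")
```

On these toy data the three-dimensional term comes out negative, i.e., more options are made available than are used, matching the sign interpretation in the abstract.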

Best,

Loet

 

  _____  

Loet Leydesdorff 

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net ; http://www.leydesdorff.net/ 

Associate Faculty, SPRU, University of Sussex; 

Guest Professor, Zhejiang Univ., Hangzhou; 

Visiting Professor, ISTIC, Beijing;

Visiting Professor, Birkbeck, University of London; 

http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en

 

 

_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
