Re: [Fis] _comment to the "A Priori Modeling of Information"

2016-06-26 Thread Loet Leydesdorff
As a first step in specifying the relevance of Shannon's engineering model for developing a theory of meaning, Weaver (1949, p. 26) proposed two minor additions to Shannon's diagram of a communication channel, as follows: 

 

"One can imagine, as an addition to the diagram, another box labeled
"Semantic Receiver" interposed between the engineering receiver (which
changes signals to messages) and the destination. This semantic receiver
subjects the message to a second decoding, the demand on this one being that
it must match the statistical semantic characteristics of the message to the
statistical semantic capacities of the totality of receivers, or of that
subset of receivers which constitute the audience one wishes to affect. 

Similarly one can imagine another box in the diagram which, inserted between
the information source and the transmitter, would be labeled "semantic
noise," the box previously labeled as simply "noise" now being labeled
"engineering noise." From this source is imposed into the signal the
perturbations or distortions of meaning which are not intended by the source
but which inescapably affect the destination. And the problem of semantic
decoding must take this semantic noise into account." 

 

 


 

Figure 1: Weaver's (1949) "minor" additions penciled into Shannon's (1948)
original diagram.

 

Since the "semantic receiver" recodes the information in the messages
(received from the "engineering receiver" who only changes signals into
messages) while having to assume the possibility of "semantic noise," a
semantic relationship between the two new boxes can also be envisaged. Given
Shannon's framework, however, this relation cannot be considered as another
information transfer-since semantics are defined as external to Shannon's
engineering model. 

 

Semantics are not based on specific communications, but on relations among
patterns of relations or, in other words, on correlations. In the case of a
single relation, the relational distance does not differ from the
correlational one; but in the case of relations involving three (or more)
agents, the distances in the vector space differ from the Euclidean distances
in the network space. In a triplet, the instantiation of one or the other
relation can make a difference for the further development of the triadic
system of relations. 
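
To make this difference concrete, here is a minimal sketch (an illustration of
my own, not taken from the cited papers) that compares the two representations
for a hypothetical triplet of agents; the cosine between relation patterns is
used as one possible vector-space measure:

import numpy as np

# Three agents A, B, and C with observable relations A-B and B-C (no A-C tie),
# encoded as a symmetric adjacency matrix.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)

# Network space: path lengths along the observable relations.
# For this small graph they can be read off directly: A and C are two steps apart.
network_distance = np.array([[0, 1, 2],
                             [1, 0, 1],
                             [2, 1, 0]])

# Vector space: each agent is represented by its pattern of relations (its row);
# the cosine between patterns serves here as the correlational measure.
def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

n = adj.shape[0]
pattern_similarity = np.array([[cosine(adj[i], adj[j]) for j in range(n)]
                               for i in range(n)])

print(network_distance)
print(pattern_similarity)

In this toy case, A and C are maximally distant in the network of observable
relations, yet their relational patterns coincide in the vector space
(cosine = 1.0); the correlational representation thus tells a different story
than the graph of instantiated relations.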

 

A system of relations can be considered a semantic domain (Maturana, 1978). In
other words, the sender and receiver are related in the graph of Figure 1,
while they are correlated in terms of not-necessarily-instantiated relations
in the background. The structure of correlations forms a latent background
that provides meaning to the information exchanges in relations. The
correlations are based on the same information, but the representation in the
vector space differs from the graph in the network space of observable
relations. 

 

In other words, meaning is not added to the information; rather, the same
information is delineated differently and considered from a different
perspective (including absent relations, i.e., zeros in the distribution).
Unlike Shannon-type information, which flows linearly from the sender to the
receiver, meanings can be expected to loop and thereby to develop next-order
dimensionalities. New meanings generate new options and thus redundancy. In my
opinion, the task is to specify mechanisms that generate redundancy (cf.
Leydesdorff & Ivanova, 2014).
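
One way to make such redundancy measurable, along the lines of Leydesdorff &
Ivanova (2014), is the mutual information in three (or more) dimensions, which
can become negative; a negative value is then read as redundancy. Below is a
minimal sketch of this computation for an invented joint distribution of three
binary variables (the XOR-like distribution is chosen only because it yields a
negative value):

import numpy as np
from itertools import product

def H(p):
    """Shannon entropy (in bits) of a probability array; zero terms are skipped."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical joint distribution p(x, y, z) over three binary variables,
# with an XOR-like dependency among the three (illustration only).
p = np.zeros((2, 2, 2))
for x, y in product([0, 1], repeat=2):
    p[x, y, (x + y) % 2] = 0.25

# Marginal and joint entropies needed for mutual information in three dimensions.
H1 = H(p.sum(axis=(1, 2)))
H2 = H(p.sum(axis=(0, 2)))
H3 = H(p.sum(axis=(0, 1)))
H12 = H(p.sum(axis=2).ravel())
H13 = H(p.sum(axis=1).ravel())
H23 = H(p.sum(axis=0).ravel())
H123 = H(p.ravel())

# Mutual information in three dimensions.
T123 = H1 + H2 + H3 - H12 - H13 - H23 + H123
print(T123)  # -1.0 bit for this distribution: a negative value,
             # read here as redundancy in the three-way configuration.

For this distribution the value is -1 bit; the change of sign, not the
specific distribution, is the point of the illustration.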

 

Source: Loet Leydesdorff, Alexander Petersen, and Inga A. Ivanova, "The
Self-Organization of Meaning and the Reflexive Communication of Information,"
Social Science Information (in press).

Loet Leydesdorff and Inga A. Ivanova, "Mutual Redundancies in Inter-human
Communication Systems: Steps Towards a Calculus of Processing Meaning,"
Journal of the Association for Information Science and Technology 65(2)
(2014), 386-399.

 


Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ&hl=en

 





___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] _comment to the "A Priori Modeling of Information"

2016-06-25 Thread Emanuel Diamant
 

Dear FIS all,

Dear Marcus, 

 

The video "A Priori Modeling of Information" looks great, but when I have to
provide an opinion about something brought to my judgment, I prefer to read it
rather than listen to a vocalized presentation. Therefore, I asked you to
provide me with a printed version of your proposal. You kindly fulfilled my
request, and I owe you many, many thanks in return. However, it turns out that
all the effort was in vain: from the printed texts, I also did not understand
anything. The texts are full of notions that are unknown to me and bizarre:
"universal meaning", "aesthetic entropy", "generative informatics", "entropic
mimicry", "behavioral entropy" and so on. From this mass of unknown notions,
only one was somehow close to my level of comprehension: the "theory of
meaning". Therefore, with your permission, I will comment only on this
particular point of your presentation.

 

In at least 3 out of 4 of your documents, you mention the Shannon-Weaver
(1949) "theory of meaning" as a key component of your attempt to derive a
meaningful informational view. 

A quick glance at Weaver's part of The Mathematical Theory of Communication (a
composite of two separate papers, one by Shannon and one by Weaver) reveals
the context in which Weaver uses this term:

 

"The concept of information developed in this theory at first seems
disappointing and bizarre-disappointing because it has nothing to do with
meaning and bizarre because it deals not with a single message but rather
with the statistical character of a whole ensemble of messages.

 

I think, however, that these should be only temporary reactions; and that
one should say, at the end, that this analysis has so penetratingly cleared
the air that one is now, perhaps for the first time, ready for a real theory
of meaning.

 

This idea that a communication system ought to try to deal with all possible
messages, and that the intelligent way to try is to base design on the
statistical character of the source, is surely not without significance for
communication in general." (Weaver, 1949, p. 27)

 

This is the one and only occasion on which Weaver uses this term: a real
theory of meaning. He uses it in a hypothetical form ("perhaps for the first
time, ready for a real theory of meaning"), while "the concept of information
developed in this theory ... has nothing to do with meaning." A very important
and interesting point is that Weaver implicitly binds the new theory of
meaning to the attempt "to base design on the statistical character of the
source."  
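
Weaver's observation that the theory "has nothing to do with meaning" and
deals only with "the statistical character of a whole ensemble of messages" is
easy to make concrete: Shannon's entropy depends only on the symbol statistics
of the source. A small illustration of my own (not taken from the texts under
discussion):

from collections import Counter
from math import log2

def source_entropy(text: str) -> float:
    """First-order Shannon entropy (bits per symbol) of a message's letter statistics."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

message = "send reinforcements now"
scrambled = "".join(sorted(message))   # same symbols, meaning destroyed

print(source_entropy(message))    # the two values are identical:
print(source_entropy(scrambled))  # H sees only the statistics, not the meaning

The scrambled string carries exactly the same Shannon information per symbol
as the original message although it means nothing at all; this is precisely
the gap that a theory of semantic information would have to close.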

 

At the same time (in his part of the same publication), Shannon was much more
determined and firm on the issue: "These semantic aspects of communication are
irrelevant to the engineering problem. It is important to emphasize, at the
start, that we are not concerned with the meaning or the truth of messages;
semantics lies outside the scope of mathematical information theory." Shannon
does not use any euphemisms (like "theory of meaning") to explain his
intentions; he calls the child by its real name: semantics! That is the name
of his choice! Essentially, "semantic information" is the name of the issue
that is at the heart of all our current discussions and of your (Marcus)
current proposal. 

 

Three years later, in 1952, Yehoshua Bar-Hillel and Rudolf Carnap coined the
term "Semantic Information," which has since become the dominant theme of the
ongoing scientific discourse. Again: semantic information, and not the theory
of meaning that you try to revitalize.

 

The reason why the notion of "Semantic Information" has not become a
legitimate part of information theory is that Bar-Hillel and Carnap tried to
derive it from the assumptions of Shannon's information theory itself. The
same holds for Weaver, when he spoke about his hope for the future development
of a theory of meaning; for you (Marcus), in your attempt to revive it in your
proposal; and for Terrence Deacon (whom you quote in support of your claims),
in his recent publications showing how the concept of entropy can be used to
explain the relationship between information, meaning, and work.

 

Shannon, in his 1956 paper "The Bandwagon," warned against such a misuse of
his information theory: "In short, information theory is currently partaking
of a somewhat heady draught of general popularity. It will be all too easy for
our somewhat artificial prosperity to collapse overnight when it is realized
that the use of a few exciting words like information, entropy, redundancy, do
not solve all our problems." These are Shannon's words. But who cares? 

 

Pedro does not like it when I begin to preach and teach FIS people that the
Sun rises in the East.

Okay, I agree, accept, and obey his constraints. For that reason, I will shut
up and end my comments here.

 

My best and kind regards to all of you,

Yours, Emanuel Diamant.

 
