Dear FIS all,

Dear Marcus, 

 

The video "A Priori Modeling of Information" looks great, but when I have to
provide an opinion about something brought to my judgment, I prefer to read
it rather than to listen to its vocalized presentation. Therefore, I asked you
to provide me with a printed version of your proposal. You have kindly
fulfilled my request, and for that I owe you many, many thanks in return.
However, it turns out that all the effort was in vain - from the printed
texts, too, I understood nothing. They (the texts) are full of notions that
are unknown to me and bizarre: "universal meaning", "aesthetic entropy",
"generative informatics", "entropic mimicry", "behavioral entropy" and so
on. From this mass of unknown notions only one was somewhat close to my level
of comprehension - the "theory of meaning". Therefore, with your permission,
I will comment only on this particular point of your presentation.

 

In at least 3 out of 4 of your documents you mention the Shannon-Weaver
(1949) "theory of meaning" as a key component of your attempt to derive a
meaningful informational view.

A quick glance at Weaver's part of The Mathematical Theory of
Communication (a composite of two separate papers, one by Shannon and one by
Weaver) reveals the context in which Weaver uses this term:

 

"The concept of information developed in this theory at first seems
disappointing and bizarre-disappointing because it has nothing to do with
meaning and bizarre because it deals not with a single message but rather
with the statistical character of a whole ensemble of messages.

 

I think, however, that these should be only temporary reactions; and that
one should say, at the end, that this analysis has so penetratingly cleared
the air that one is now, perhaps for the first time, ready for a real theory
of meaning.

 

This idea that a communication system ought to try to deal with all possible
messages, and that the intelligent way to try is to base design on the
statistical character of the source, is surely not without significance for
communication in general." (Weaver, 1949, p. 27)

 

This is the one and only occasion on which Weaver uses this term: a real
theory of meaning. He uses it in a hypothetical form - "perhaps for the first
time, ready for" it. Meanwhile, "the concept of information developed in this
theory... has nothing to do with meaning". A very important and interesting
point is that Weaver implicitly ties the hoped-for theory of meaning to an
attempt to "base design on the statistical character of the source".

 

At the same time (in his part of the same paper), Shannon was much more
determined and direct about the issue: "These semantic aspects of
communication are irrelevant to the engineering problem. It is important to
emphasize, at the start, that we are not concerned with the meaning or the
truth of messages; semantics lies outside the scope of mathematical
information theory". Shannon does not use any euphemisms (like "theory of
meaning") to explain his intentions; he calls the child by its real name -
semantics! That is the name of his choice! Semantic information is,
essentially, the name of the issue that lies at the heart of all our current
discussions and of your (Marcus) current proposal.

 

Three years later, in 1952, Yehoshua Bar-Hillel and Rudolf Carnap coined
the term "Semantic Information", which has since become the dominant theme
of the ongoing scientific discourse. Again: semantic information, and not
the theory of meaning that you are trying to revitalize.

 

The reason why the notion of "Semantic Information" has not become a
legitimate part of information theory is that Bar-Hillel and Carnap tried
to derive it from the assumptions of Shannon's original information theory.
The same goes for Weaver, when he spoke of his hope for the future
development of a theory of meaning. The same for you (Marcus), in your
attempt to revive it in your proposal. And the same for Terrence Deacon
(whom you quote in support of your claims), in his recent publications:
showing how the concept of entropy can be used to explain the relationship
between information, meaning and work.

 

Shannon, in his 1956 paper "The Bandwagon", warned against such a misuse
of his information theory: "In short, information theory is currently
partaking of a somewhat heady draught of general popularity. It will be all
too easy for our somewhat artificial prosperity to collapse overnight when
it is realized that the use of a few exciting words like information,
entropy, redundancy, do not solve all our problems". These are Shannon's
words. But who cares?

 

Pedro does not like it when I begin to preach and to teach FIS people that
the Sun rises in the East.

Okay, I agree, accept, and obey his constraints. For that reason, I will
stop my comments here.

 

My best and kind regards to all of you,

Yours, Emanuel Diamant.

 

