Dear Xueshan,

You ask "how should we understand this paradox?"

I suggest that we start by looking at what it might mean for information or meaning to be 'contained' in a sentence. Lakoff would have told us that this is a metaphor, and specifically an instance of the pervasive 'container metaphor'. According to Lakoff and Johnson:


Container metaphor.

A containment metaphor is an ontological metaphor in which some concept is represented as:

 *      having an inside and outside, and
 *      capable of holding something else.


 *      I’ve had a full life.
 *      Life is empty for him.
 *      Her life is crammed with activities.
 *      Get the most out of life.

Lakoff, George, and Mark Johnson. 1980. Metaphors we live by. Chicago: University of Chicago Press.


The paradox is dissolved by proposing that "In everyday speech it is usual to say that a sentence has 'an inside and an outside', and that it is 'capable of holding something else', but this is no more than a convenient fiction. Both 'information' and 'meaning' (in the senses you are using) are constituted by social and cognitive processes, and consideration of these processes can enable us to understand the relationship between the two terms".



On 26/02/18 09:47, Xueshan Yan wrote:

Dear colleagues,

In my teaching career in Information Science, I was often puzzled by the following inference, which I call the *Paradox of Meaning and Information*, or the *Armenia Paradox*. In order not to produce unnecessary ambiguity, I state it below and strictly limit our discussion to the human context.

Suppose an earthquake occurred in Armenia last night and all of the main media of the world have reported it. The next day, two students, A and B, have the following dialogue about the newspaper headline “*Earthquake Occurred in Armenia Last Night*”:

Q: What is the *MEANING* contained in this sentence?

A: An earthquake occurred in Armenia last night.

Q: What is the *INFORMATION* contained in this sentence?

A: An earthquake occurred in Armenia last night.

Thus we come to the conclusion that *MEANING is equal to INFORMATION*, or, strictly speaking, that human meaning is equal to human information. In Linguistics, the study of human meaning is called Human Semantics; in Information Science, the study of human information is called Human Informatics.

Historically, Human Linguistics has had two definitions: (1) the study of human language; (2) also called Anthropological Linguistics or Linguistic Anthropology, the historical and cultural study of a human language. Without loss of generality, we adopt only the first definition here, so we regard Human Linguistics and Linguistics as the same.

Since Human Semantics is a discipline of Linguistics whose main task is to deal with human meaning, and Human Informatics is a discipline of Information Science whose main task is to deal with human information, and since human meaning is equal to human information, we have the following corollary:

A: *Human Informatics is a subfield of Human Linguistics*.

According to the definition given by general linguists, language is a vehicle for transmitting information; therefore, Linguistics is a branch of Human Informatics, and we have another corollary:

B: *Human Linguistics is a subfield of Human Informatics*.

Clearly, A and B are contradictory, i.e. logically unacceptable. It is a paradox in Information Science and Linguistics. In most cases, the resolution of such a paradox leads to important discoveries in a subject, but how should we understand this paradox?

Best wishes,


Fis mailing list
