Dear Joseph - once again your post was most stimulating, provocative
and enjoyable. Kolmogorov's definition of information that you quote
is most interesting, but like Shannon's it incorporates the notion
that information is a quantitative concept that can be measured. The
Bateson definition that you refer to was a critique of this notion of
information as a quantitative measure. The criticism began with
MacKay (1969. Information, Mechanism and Meaning. Cambridge, MA: MIT
Press), who wrote "Information is a distinction that makes a
difference," which Bateson (1973. Steps to an Ecology of Mind.
Frogmore, St. Albans: Paladin) then built on to arrive at the more
popular "Information is a difference that makes a difference." MacKay
was the first to critique Shannon's quantitative definition of
information, which Shannon had framed in his famous formulation:
"We have represented a discrete information source as a Markoff
process. Can we define a quantity which will measure, in some sense,
how much information is ‘produced’ by such a process, or better, at
what rate information is produced?" (Shannon 1948. A Mathematical
Theory of Communication. Bell System Technical Journal, vol. 27, pp.
379-423 and 623-656, July and October 1948)
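For concreteness (my gloss, not part of Shannon's text): the quantity
Shannon went on to define is the entropy of the source,

H = -\sum_i p_i \log_2 p_i ,

measured in bits per symbol, where the p_i are the probabilities of
the source's symbols. A fair coin toss, for example, carries exactly
one bit.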
According to Claude Shannon (1948, p. 379), his definition of
information is not connected to its meaning. Weaver concurred in his
introduction to Shannon's A Mathematical Theory of Communication when
he wrote: "Information has 'nothing to do with meaning' although it
does describe a 'pattern'." Shannon also suggested that information
in the form of a message often contains meaning, but that meaning is
not a necessary condition for defining information. So it is possible
to have information without meaning, whatever that means.
Not all of the members of the information science community were
happy with Shannon’s definition of information. Three years after
Shannon proposed his definition of information, Donald MacKay (1951)
at the 8th Macy Conference argued for another approach to
understanding the nature of information. The highly influential Macy
Conferences on cybernetics, systems theory, information and
communications were held from 1946 to 1953 during which Norbert
Wiener’s newly minted cybernetic theory and Shannon’s information
theory were discussed and debated by a fascinating interdisciplinary
group of scholars that also included Warren
McCulloch, Walter Pitts, Gregory Bateson, Margaret Mead, Heinz von
Foerster, Kurt Lewin and John von Neumann. MacKay argued that he did
not see “too close a connection between the notion of information as
we use it in communications engineering and what [we] are doing here…
the problem here is not so much finding the best encoding of symbols…
but, rather, the determination of the semantic question of what to
send and to whom to send it." He suggested that information be
identified with "the change in a receiver's mind-set, and thus with
meaning," and not just with the sender's signal (Hayles 1999b, p.
74). The
notion of information independent of its meaning or context is like
looking at a figure isolated from its ground. As the ground changes
so too does the meaning of the figure.
The last two paragraphs are an excerpt from my new book What is
Information? to be published by the University of Toronto Press in
late 2010 or early 2011. Your post, Joseph, has stimulated the
following thoughts that I hope to add to my new book before it is
typeset.
As MacKay and Bateson have argued, there is a qualitative dimension
to information not captured by the Shannon-Weaver quantitative model,
nor by Kolmogorov's definition. Information is multidimensional:
there is a quantitative dimension, as captured by Shannon and
Kolmogorov, and a qualitative one of meaning, as captured by MacKay
and Bateson, but one can think of other dimensions as well. In
responding to a communication by Joseph Brenner on the Foundations of
Information Science (FIS) listserv, I described the information that
he communicated as
stimulating, provocative and enjoyable. Brenner cited the following
Kolmogorov definition of information as “any operator which changes
the distribution of probabilities in a given set of events.”
Brenner's information changed the distribution of my mental events to
one of stimulation, provocation and enjoyment, and so there is
something authentic that this definition of Kolmogorov's captures
which his earlier cited definition of information as "the minimum
computational resources needed to describe a program or a text" does
not. We therefore conclude that not only is there a relativistic
component to information, but that information is multidimensional,
not uni-dimensional as is the case with Shannon information.
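(To put Kolmogorov's two definitions side by side, in a rough
formalization of my own: the complexity definition assigns a text x
the single number

K(x) = \min \{ \ell(p) : U(p) = x \} ,

the length \ell(p) of the shortest program p that makes a universal
machine U print x, a pure quantity independent of any receiver. The
operator definition instead treats a piece of information as a map
P \mapsto P' from a prior distribution over events to a posterior
one, so that what the information "does" depends on the receiver's
prior P, which is exactly where the relativistic, receiver-dependent
dimension enters.)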
Joseph - many thanks for your stimulating post - I look forward to
your comments on this riff on your thoughts. - Bob
______________________
Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto
www.physics.utoronto.ca/Members/logan
On 22-Feb-10, at 12:43 AM, joe.bren...@bluewin.ch wrote:
Dear FIS Colleagues and Friends,
As you have for a long time before me, I have been trying to tame
(I prefer the French make private – apprivoiser) the notion of
information. One thought was suggested by Bateson’s seemingly
generally accepted dictum of “a difference (and/or distinction)
that makes a difference." But I think this difference is no ordinary
"delta"; it is an active referring, or better, differing term, like
the différance of Derrida. I’m sure someone has made a reference to
this before – I’m new here – but then Derrida uses différance to
question the structure of binary oppositions, and says that
différance “invites us to undo the need for balanced equations, to
see if each term in an opposition is not after all an accomplice of
the other. At the point where the concept of différance intervenes,
all of the conceptual oppositions of metaphysics, to the extent
that they have for ultimate reference the presence of a present …
(signifier/signified; diachrony/synchrony; space/time; passivity/
activity, etc.) become non-pertinent." Since most of the usual
debates about information are based on such conceptual oppositions,
and classical notions of here and now, it may be high time to
deconstruct them.
I am sure you are familiar with this, but I found it rather
interesting to read that Kolmogorov had given one definition of
information as “any operator which changes the distribution of
probabilities in a given set of events”. (Apparently, this idea was
attacked by Markov.)
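(A minimal concrete instance, on the usual Bayesian reading rather
than anything in Kolmogorov's own text: conditioning on an event E is
such an operator, carrying a prior distribution P to the posterior

P(A | E) = P(A \cap E) / P(E) ,

so the information E carries is precisely the change it induces in
the probabilities of the other events A.)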
Différance in the informational context then started looking to me
like an operator, especially since in my process logic, where
logical elements of real processes resemble probabilities, the
logical operators are also processes, such that a predominantly
actualized positive implication, for example, is always accompanied
by a predominantly potentialized negative implication.
At the end of all this, then, one has, starting from the lowest level:
a) information as what is processed by a computer;
b) information as a scalar quantity of uncertainty removed,
the entropy/negentropy picture;
c) semantic information as well-formed, meaningful data
(Floridi);
d) information as a process operator that makes a difference
to and for other processes, including above all those of receivers
and senders.
A first useful consequence is that information “operations” with my
operator are naturally polarized: positive, negative, or some
combination, which I'll leave open for the moment. The negative
effects of some information follow naturally. Many of you may
conclude I’m doing some oversimplification or conflation, and I
apologize for that in advance. But I believe that Kolmogorov’s
original idea has been neglected in the recent discussions of
information I’ve seen, and I would very much welcome comments.
Thank you and best wishes.
Joseph
_______________________________________________
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis