Re: [Fis] Derrida's diferAnce and Kolmogorov's Information Operator

2010-02-22 Thread Darvas Gyorgy
Dear Joe, Dear FISers,

I would like to add a short remark.
In fact, I repeat a fact that I mentioned several times in the past.
We often fail to list a very simple form of
information among the others, namely what I
call information on existence.
This is a double-valued property, in Joe's
words a binary opposition. Something exists
here and now, or it does not. E.g., I am here or I
am not here; you are hungry or you are not
hungry; it is red or not red; that object has the
same colour as this one, or a different colour, etc.
(Note that difference is not necessarily a binary
category in itself, because it compares two
things: one of the compared things has one
property, while the other side - asymmetrically -
is many-valued. These many values may be finite,
discretely infinite, or may belong to a smooth continuum.)

This observation comes from my experience
with the description of symmetries in physics.
For decades, different kinds of symmetry were
described first by discrete groups, then by
(continuous) Lie groups - more and more
complicated appearances of symmetry phenomena.
The significance of parity was recognised late,
in the nineteen fifties, and I would say that
even now there are many physicists (fortunately
not all) who are surprised when they meet the
difference between the behaviour of parity in
odd- and even-dimensional spaces. This is caused
by a lack of general knowledge about the nature
of parity, even though it is the simplest appearance of the phenomenon.
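
(As a side note for readers meeting this for the first time, a minimal
numerical illustration of the parity point may help; the formulation in
code is mine, not Gyuri's. Point inversion, x -> -x, is the matrix -I,
and its determinant is (-1)^n.)

import numpy as np

# Point inversion ("parity") in n dimensions is represented by -I.
# det(-I) = (-1)**n: +1 in even dimensions, where inversion is just an
# ordinary rotation (e.g. by 180 degrees in the plane), and -1 in odd
# dimensions, where it is a genuine reflection unreachable by any
# continuous rotation.
for n in (2, 3, 4, 5):
    parity = -np.eye(n)
    print(n, int(round(np.linalg.det(parity))))
# prints: 2 1, 3 -1, 4 1, 5 -1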

Something similar happened around existence-type information.
It is important, it is lovely, worthy of
affection. Let us not forget or neglect it.
Gyuri



At 06:43 22.02.2010, you wrote:
Dear FIS Colleagues and Friends,

As you have for a long time before me, I have
been trying to tame (I prefer the French make
private – apprivoiser) the notion of
information. One thought was suggested by
Bateson’s seemingly generally accepted dictum
of “a difference (and/or distinction) that
makes a difference”. But I think this difference
is no ordinary “delta”; it is an active
referring or, better, differing term like the
différance of Derrida. I’m sure someone has
made a reference to this before – I’m new here
– but then Derrida uses différance to
question the structure of binary oppositions,
and says that différance “invites us to undo
the need for balanced equations, to see if each
term in an opposition is not after all an
accomplice of the other. At the point where the
concept of différance intervenes, all of the
conceptual oppositions of metaphysics, to the
extent that they have for ultimate reference the
presence of a present … (signifier/signified;
diachrony/synchrony; space/time;
passivity/activity, etc.) become non-pertinent.”
Since most of the usual debates about
information are based on such conceptual
oppositions, and on classical notions of here and
now, it may be high time to deconstruct them.

I am sure you are familiar with this, but I 
found it rather interesting to read that 
Kolmogorov had given one definition of 
information as “any operator which changes the 
distribution of probabilities in a given set of 
events”. (Apparently, this idea was attacked by Markov.)
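
(For concreteness, here is a small sketch of how one might read
Kolmogorov's phrase operationally. The choice of a Bayesian update as
the "operator", and the entropy and Kullback-Leibler bookkeeping, are my
illustration, not Kolmogorov's own construction.)

import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# A prior distribution over four events.
prior = np.array([0.25, 0.25, 0.25, 0.25])

# One possible "information operator": conditioning on evidence, i.e. a
# Bayesian update with these (invented) likelihoods.
likelihood = np.array([0.9, 0.05, 0.03, 0.02])
posterior = prior * likelihood
posterior /= posterior.sum()

print("H(prior)       =", entropy(prior))      # 2.0 bits
print("H(posterior)   =", entropy(posterior))  # considerably lower
print("D(post||prior) =", kl(posterior, prior))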

Différance in the informational context then 
started looking to me like an operator, 
especially since in my process logic, where 
logical elements of real processes resemble 
probabilities, the logical operators are also 
processes, such that a predominantly actualized 
positive implication, for example, is always 
accompanied by a predominantly potentialized negative implication.

At the end of all this, then, one has, starting from the lowest level:
a)  information as what is processed by a computer;
b)  information as a scalar quantity of 
uncertainty removed, the entropy/negentropy picture;
c)  semantic information as well-formed, meaningful data (Floridi);
d)  information as a process operator that 
makes a difference to and for other processes, 
including above all those of receivers and senders.

A first useful consequence is that information 
“operations” with my operator are naturally 
polarized: positive, negative or some 
combination, which I’ll leave open for the 
moment. The negative effects of some information 
follow naturally. Many of you may conclude I’m 
doing some oversimplification or conflation, and 
I apologize for that in advance. But I believe 
that Kolmogorov’s original idea has been 
neglected in the recent discussions of 
information I’ve seen, and I would very much 
welcome comments. Thank you and best wishes.

Joseph
___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

Bridges Conference 2010, Pecs, 24-28 July: http://symmetry.hu/coming-meetings.html

Symmetry Festival: http://vod.niif.hu/symmetry2009/

Re: [Fis] Derrida's diferAnce and Kolmogorov's Information Operator

2010-02-22 Thread Loet Leydesdorff
At the end of all this, then, one has, starting from the lowest level:

a)  information as what is processed by a computer;

b)  information as a scalar quantity of uncertainty removed, the
entropy/negentropy picture;

c)  semantic information as well-formed, meaningful data (Floridi);

d)  information as a process operator that makes a difference to and for
other processes, including above all those of receivers and senders.



Dear Joseph and colleagues, 
 
I agree with the distinction of four operations, but it seems to me that
this can be expressed more parsimoniously using information theory. Given
Bateson's (1972) formulation that information can be considered as "a
difference which makes a difference", one should distinguish between the
first type of differences and the second. Let's say difference(1) and
difference(2). (I'll need difference(3) and difference(4) below.)
 
A difference(1) can only make a difference(2) for a system (or more
generally the expectation of a system). This difference(2) is analytically
preceded by difference(1), that is, pure differences. Shannon-type
information is contained in probability distributions. In the binary case,
this is only one difference (Y/N, F/T, open/closed); in the non-binary case
probability distributions provide us with sets of differences(1). These
differences(1) can only make a difference(2) for a system which contains
other (orthogonal) differences. In this case one needs one more (orthogonal)
dimension of the probability distribution that positions the incoming
(Shannon-type) information at specific moments in time. Thus, difference(2)
presumes at least a dimensionality of two in the probabilistic entropy.
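 
(A small numerical sketch of this dimensionality point; the probabilities
are invented for illustration. One dimension of probabilistic entropy
carries the differences(1) as uncertainty; a second, orthogonal dimension
lets us measure whether those differences make a difference(2), here as
the mutual information T.)

import numpy as np

def H(p):
    """Shannon entropy in bits of a (possibly multi-dimensional) distribution."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# One dimension: a probability distribution = a set of differences(1).
p_x = np.array([0.5, 0.3, 0.2])
print("H(X) =", H(p_x))

# Two dimensions: a joint distribution over X and an orthogonal variable Y.
p_xy = np.array([[0.30, 0.20],
                 [0.05, 0.25],
                 [0.15, 0.05]])

# Mutual information T(X;Y) = H(X) + H(Y) - H(X,Y): how much the
# differences(1) in X make a difference(2) for Y.
print("T(X;Y) =", H(p_xy.sum(axis=1)) + H(p_xy.sum(axis=0)) - H(p_xy))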
 
When the system develops, difference(3) can be defined with reference to the
time axis (recursion). This is Brillouin's (1962) Delta H. The difference(1)
that made a difference(2) for the system makes a difference(3) over time.
When the system operates as a self-organizing, autonomous or autopoietic
system it is additionally able to provide the information with a meaning
from the perspective of hindsight, that is, against the axis of time. This
incursion can make a difference(4). 
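 
(And a toy sketch of difference(3) as a Brillouin-style Delta H along the
time axis; the series of distributions is invented for illustration.)

import numpy as np

def H(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A system's distribution of states at successive moments (toy numbers).
series = [
    np.array([0.25, 0.25, 0.25, 0.25]),  # t0: maximal uncertainty
    np.array([0.40, 0.30, 0.20, 0.10]),  # t1: after some information arrived
    np.array([0.70, 0.20, 0.05, 0.05]),  # t2
]

# Delta H between successive moments: the difference(3) made over time.
for t in range(1, len(series)):
    print(f"Delta H (t{t-1} -> t{t}) =", H(series[t]) - H(series[t-1]))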
 
In other words, one needs at least a vector (one dimension of the entropy)
for containing an uncertainty. One needs (at least) two dimensions of the
probabilistic entropy for positioning the information in a network (matrix)
at specific moments of time. Three dimensions are needed when the time axis
is additionally included; four when the direction in the time axis can be
considered as another degree of freedom.
 
The two approaches seem very akin to me, but I claim that mine is stricter
and more parsimonious because I only need numbers of dimensions of the
probabilistic entropy and not concepts like différance. The next-order
probability distributions can be considered as the probability of
probability distributions, etc.
 
Best wishes, 
 
 
Loet

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR), 
Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
l...@leydesdorff.net; http://www.leydesdorff.net/

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Derrida's diferAnce and Kolmogorov's Information Operator

2010-02-22 Thread bob logan
Dear Joseph - once again your post was most stimulating, provocative
and enjoyable. Kolmogorov's definition of information that you quote
is most interesting but, like Shannon's definition, incorporates the
notion that information is a quantitative concept that can be
measured. The Bateson definition that you refer to was a critique of
this notion of information as a quantitative measure. The criticism
began with MacKay (1969, Information, Mechanism and Meaning.
Cambridge, MA: MIT Press), who wrote "Information is a distinction
that makes a difference", which Bateson (1973, Steps to an Ecology of
Mind. St. Albans: Paladin Frogmore) then built on to come up with the
more popular "Information is a difference that makes a difference".
MacKay was the first to critique Shannon's quantitative definition of
information when Shannon wrote his famous definition: "We have
represented a discrete information source as a Markoff process. Can we
define a quantity which will measure, in some sense, how much
information is ‘produced’ by such a process, or better, at what rate
information is produced?" - Shannon (1948, A mathematical theory of
communication. Bell System Technical Journal, vol. 27, pp. 379-423 and
623-656, July and October 1948).


According to Claude Shannon (1948, p. 379) his definition of  
information is not connected to its meaning. Weaver concurred in his  
introduction to Shannon’s A Mathematical Theory of Communication when  
he wrote: “Information has ‘nothing to do with meaning’ although it  
does describe a ‘pattern’.” Shannon also suggested that information  
in the form of a message often contains meaning but that meaning is  
not a necessary condition for defining information. So it is possible  
to have information without meaning, whatever that means.


Not all of the members of the information science community were  
happy with Shannon’s definition of information. Three years after  
Shannon proposed his definition of information, Donald MacKay (1951)  
at the 8th Macy Conference argued for another approach to  
understanding the nature of information. The highly influential Macy  
Conferences on cybernetics, systems theory, information and  
communications were held from 1946 to 1953 during which Norbert  
Wiener’s newly minted cybernetic theory and Shannon’s information  
theory were discussed and debated with a fascinating  
interdisciplinary team of scholars which also included Warren  
McCulloch, Walter Pitts, Gregory Bateson, Margaret Mead, Heinz von  
Foerster, Kurt Lewin and John von Neumann. MacKay argued that he did  
not see “too close a connection between the notion of information as  
we use it in communications engineering and what [we] are doing here…  
the problem here is not so much finding the best encoding of symbols… 
but, rather, the determination of the semantic question of what to  
send and to whom to send it.” He suggested that information should be  
defined as “the change in a receiver’s mind-set, and thus with  
meaning” and not just the sender’s signal (Hayles 1999b, p. 74). The  
notion of information independent of its meaning or context is like  
looking at a figure isolated from its ground. As the ground changes  
so too does the meaning of the figure.


The last two paragraphs are an excerpt from my new book What is  
Information? to be published by the University of Toronto Press in  
late 2010 or early 2011. Your post Joseph has stimulated the  
following thoughts that I hope to add to my new book before it is  
typeset.


As MacKay and Bateson have argued, there is a qualitative dimension to
information not captured by the Shannon-Weaver quantitative model or by
Kolmogorov's definition. Information is multidimensional. There is a
quantitative dimension, as captured by Shannon and Kolmogorov, and a
qualitative one of meaning, as captured by MacKay and Bateson, but one
can think of other dimensions as well. In responding to a communication
by Joseph Brenner on the Foundations of Information (FIS) listserv I
described the information that he communicated as stimulating,
provocative and enjoyable. Brenner cited the following Kolmogorov
definition of information: “any operator which changes the distribution
of probabilities in a given set of events.” Brenner's information
changed the distribution of my mental events to one of stimulation,
provocation and enjoyment, and so there is something authentic that
this definition of Kolmogorov's captures that his earlier cited
definition of information, as the minimum computational resources
needed to describe a program or a text, does not. We therefore conclude
that not only is there a relativistic component to information but that
it is also multidimensional, and not unidimensional as is the case with
Shannon information.
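
(For readers less familiar with that earlier Kolmogorov definition - the
minimum computational resources needed to describe a text - a rough
sketch may help. zlib compression is only a crude stand-in for that
minimal description length, and the strings are invented, but it shows
how two texts with the same character-frequency (Shannon) entropy can
have very different description lengths.)

import random
import zlib
from collections import Counter
from math import log2

def shannon_bits_per_char(text):
    """Empirical Shannon entropy of the character frequencies, in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def description_length(text):
    """zlib-compressed size in bytes: a crude upper bound on the minimal description."""
    return len(zlib.compress(text.encode("utf-8")))

random.seed(0)
regular = "ab" * 500                                            # highly patterned
irregular = "".join(random.choice("ab") for _ in range(1000))   # same alphabet, no pattern

for name, text in (("regular", regular), ("irregular", irregular)):
    print(name, round(shannon_bits_per_char(text), 3), "bits/char,",
          description_length(text), "bytes compressed")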


Joseph - many thanks for your stimulating post - I look forward to  
your comments on this riff on your thoughts. - Bob