Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-23 Thread Loet Leydesdorff
Dear Joe, 
 
5. The description of differences in terms of levels of complexity and
recursion affecting Shannon-type information is essential, because it also
provides an analytical basis for meaning. Perhaps the sequence goes from
vector to tensor to spinor (?) as one goes up in the dimensionality of the
entropy, yielding valuedness or valence?

 

Yes, it yields valuedness because the differences(1) make a difference(2),
etc. For example, the information contained in a vector is positioned in the
network/matrix, and this position has a value. The operation is recursive,
but it closes itself off at the level of four dimensions.

 

First, there are only differences(1): the expected information content of a
distribution. These can make a difference(2) to an extension. Difference(3)
arises when this happens not just once but repeatedly over time, that is, in
a three-dimensional array. (Is that a three-dimensional tensor?) In the next
recursion, difference(4) gains the additional degree of freedom of playing
with the direction of time: incursion versus recursion becomes possible.
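
To make the recursion concrete, here is a minimal Python sketch (an
illustration constructed for this point, using numpy; the distributions and
numbers are invented, not Loet's own formalism) of how each difference adds
one dimension to the probabilistic entropy:

```python
# Sketch: Shannon entropy over probability arrays of increasing
# dimensionality. difference(1): a one-dimensional distribution;
# difference(2): the same information positioned in a two-dimensional
# network/matrix; difference(3): the matrix repeated over time as a
# three-dimensional array; difference(4): the freedom to read that
# array with or against the direction of time.
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array (any shape)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                     # convention: 0 * log2(0) = 0
    return -np.sum(p * np.log2(p))

vec = np.array([0.5, 0.25, 0.25])          # difference(1)
mat = np.outer(vec, [0.6, 0.4])            # difference(2): positioned
cube = np.stack([0.7 * mat, 0.3 * mat])    # difference(3): over time
print(H(vec), H(mat), H(cube))             # entropy grows with each dimension

# difference(4): incursion reads the time axis backwards; the entropy is
# the same, the new degree of freedom lies in the direction of reading.
print(np.isclose(H(cube[::-1]), H(cube)))  # True
```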

 

Let us reformulate this in terms of evolution theory: differences(1) are only
variation. Difference(2) positions the variation selectively: the structure
of the system determines the value of the variation. Difference(3) adds the
time axis and therefore stabilization: some selections are selected for
stabilization. Difference(4) adds globalization: some stabilizations are
selected for globalization. Globalization means that a next-order systems
level folds back on the system, closes it off, and makes it a possible
carrier for a next-order systems dynamics.

 

In other words, stabilizations can be at variance and thus provide a
next-order variation with reference to difference(1). Difference(4) can
analogously be considered a next-order selection mechanism. But the system
now already contains time (difference(3)) and can also use time as a degree
of freedom in its performance. The monad is constituted. It closes off
--performing its own autopoiesis-- but remains open, in terms of its
stabilizations (= second-order variations), for other systems dimensions to
build further upon. Because of its fourth dimension it is not subsumed but
remains an independent reality.

 

Is this consonant with Logic in Reality? 

 

Best wishes, 

 

 

Loet

 
  _  

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR), 
Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20-525 6598; fax: +31-20-525 3681 
l...@leydesdorff.net; http://www.leydesdorff.net/ 

 




Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-23 Thread joe.bren...@bluewin.ch





 



Dear Bob, Loet, Gyuri and All,

 

Progress?! Between Bob, Loet, and
something of my logical approach, I see the “art of understanding information”
developing in its necessarily dialectically connected synthetic and analytical
aspects. Here are a few of the ideas suggested by Bob’s historical notes, very
useful for me, and by Loet’s elaboration of the complexity of difference.

 

1. The quantitative
characteristics of information are more or less clear. My list of definitions
was not intended to be exhaustive.

 

2. MacKay's semantic question of what to send and where to send it concerns
a process taking place in the sender's mind. His definition of information as
the change in a receiver's mind-set, and thus as concerned with meaning, also
describes a dynamic process. He should simply have added that, when the
signal is finally sent, there is a change in the sender's "mind-set" as well.
These relations and changes can be described in my logical terms.

 

3. The Gestalt description of sufficiently complex information and meaning
connected as figure and ground should have been obvious to me long ago; it
wasn't, but it certainly is now. Logic in Reality provides a principled
dynamic description of the linked changes of figure and ground, alternately
predominating in the mind in two dimensions. The analogy is not perfect,
however: one needs to keep in mind the vertical, inter-level relation
between information and meaning. It is this kind of information, and that in
point 2, that I would like to describe as logical information operators. 

 

4. Information without meaning (Bob's paragraph 2) is information that is,
to all intents and purposes, incapable of making a direct causal difference,
such as a database. 

 

5. The description of differences in terms of levels of complexity and
recursion affecting Shannon-type information is essential, because it also
provides an analytical basis for meaning. Perhaps the sequence goes from
vector to tensor to spinor (?) as one goes up in the dimensionality of the
entropy, yielding valuedness or valence? 

 

6. The concept of allegedly self-organizing, autonomous and autopoïetic
systems, however, requires further explication of how these wonderful
properties originate in reality. Loet's statement that the two approaches are
"very akin" is very welcome in the analytic domain, since his approach is
indeed stricter and more parsimonious, and différance is only a philosophical
concept. However, différance is in a sense directly related to complex real
physical systems, such as information producers and receivers. Ascription of
autonomy, etc., where it does exist, complicates things. I am trying to get a
handle on information as an ontological operator, not one related to
epistemic or doxastic differences. 

 

7. From this perspective, a clarification of Gyuri's note about the "simple
form of information" to which he gives the very intriguing designation
"information on existence". Everything that has to do with existence is of
interest to me, but my Logic in Reality does not, and is not intended to,
apply to the entire extant domain. The examples you give, Gyuri, are binary
or, in other cases, in a one-to-many relation. There is existence here,
double-valuedness and information, and at an even more fundamental level
parity, as a property of quantum entities. But this is not binary opposition;
there is no opposition here, no exchange of energy, no caused differences. At
the limit, there is no physical change at all of the duals as such, or if
there is, it is only of the state-transition type (cf. Loet's Y/N, F/T,
open-closed and your same colour - different colour, presence - absence). So
by all means let us discuss the concept of existence-type information. For
me, the test of its utility would be the extent to which it might apply to,
or be included in (as lowest-level semantic information is), something
complex and interactive.

 

Many thanks.

 

Best,

 

Joseph






Original message

From: lo...@physics.utoronto.ca

Date: 22.02.2010 15:07

To: 

Cc: "Pedro Clemente Marijuan Fernandez", 
"fis"

Subject: Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator




Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-22 Thread bob logan
Dear Joseph - once again your post was most stimulating, provocative
and enjoyable. Kolmogorov's definition of information that you quote
is most interesting, but like Shannon's definition it incorporates the
notion that information is a quantitative concept that can be
measured. The Bateson definition that you refer to was a critique of
this notion of information as a quantitative measure. The criticism
began with MacKay (1969, Information, Mechanism and Meaning.
Cambridge, MA: MIT Press), who wrote "Information is a distinction
that makes a difference", which Bateson (1973, Steps to an Ecology of
Mind. St. Albans: Paladin Frogmore) then built on to arrive at the
more popular "Information is a difference that makes a difference".
MacKay was the first to critique Shannon's quantitative definition of
information, which Shannon had framed as follows: "We have represented
a discrete information source as a Markoff process. Can we define a
quantity which will measure, in some sense, how much information is
'produced' by such a process, or better, at what rate information is
produced?" - Shannon (1948, A mathematical theory of communication.
Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656).
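
To see what Shannon's "rate" means in practice, here is a small Python
sketch (a toy two-state chain of my own invention, not from Shannon's paper)
computing the entropy rate of a Markoff source as the stationary-weighted
entropy of its transition rows:

```python
# Toy sketch: entropy rate of a two-state Markoff source,
# H = sum_i pi_i * H(P[i]), in bits per emitted symbol.
# The transition matrix is an invented example.
import numpy as np

P = np.array([[0.9, 0.1],          # row i: distribution of the next state
              [0.4, 0.6]])

# Stationary distribution: the left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()                     # here pi = (0.8, 0.2)

def H(row):
    row = row[row > 0]
    return -np.sum(row * np.log2(row))

rate = sum(p_i * H(row) for p_i, row in zip(pi, P))
print(f"{rate:.2f} bits per symbol")   # about 0.57 for this chain
```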


According to Claude Shannon (1948, p. 379), his definition of
information is not connected to its meaning. Weaver concurred in his
introduction to Shannon's A Mathematical Theory of Communication when
he wrote: "Information has 'nothing to do with meaning', although it
does describe a 'pattern'." Shannon also suggested that information
in the form of a message often contains meaning, but that meaning is
not a necessary condition for defining information. So it is possible
to have information without meaning, whatever that means.


Not all of the members of the information science community were
happy with Shannon's definition of information. Three years after
Shannon proposed his definition of information, Donald MacKay (1951)
argued at the 8th Macy Conference for another approach to
understanding the nature of information. The highly influential Macy
Conferences on cybernetics, systems theory, information and
communications were held from 1946 to 1953, during which Norbert
Wiener's newly minted cybernetic theory and Shannon's information
theory were discussed and debated by a fascinating
interdisciplinary team of scholars that also included Warren
McCulloch, Walter Pitts, Gregory Bateson, Margaret Mead, Heinz von
Foerster, Kurt Lewin and John von Neumann. MacKay argued that he did
not see "too close a connection between the notion of information as
we use it in communications engineering and what [we] are doing here…
the problem here is not so much finding the best encoding of symbols…
but, rather, the determination of the semantic question of what to
send and to whom to send it." He suggested that information should be
defined as "the change in a receiver's mind-set, and thus with
meaning", and not just as the sender's signal (Hayles 1999b, p. 74).
The notion of information independent of its meaning or context is
like looking at a figure isolated from its ground: as the ground
changes, so too does the meaning of the figure.


The last two paragraphs are an excerpt from my new book What is
Information?, to be published by the University of Toronto Press in
late 2010 or early 2011. Your post, Joseph, has stimulated the
following thoughts, which I hope to add to my new book before it is
typeset.


As MacKay and Bateson have argued, there is a qualitative dimension to
information not captured by the Shannon-Weaver quantitative model, nor
by Kolmogorov's definition. Information is multidimensional. There is
a quantitative dimension, as captured by Shannon and Kolmogorov, and a
qualitative one of meaning, as captured by MacKay and Bateson, but one
can think of other dimensions as well. In responding to a
communication by Joseph Brenner on the Foundations of Information
Science (FIS) listserv, I described the information that he
communicated as stimulating, provocative and enjoyable. Brenner cited
the following Kolmogorov definition of information: "any operator
which changes the distribution of probabilities in a given set of
events." Brenner's information changed the distribution of my mental
events to one of stimulation, provocation and enjoyment, and so there
is something authentic that this definition of Kolmogorov's captures
which his earlier cited definition of information, as "the minimum
computational resources needed to describe a program or a text", does
not. We therefore conclude that not only is there a relativistic
component to information, but it is also multidimensional, and not
uni-dimensional as is the case with Shannon information.


Joseph - many thanks for your stimulating post - I look forward to  
your comments on this riff on your thoughts. - Bob

___

Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-22 Thread Loet Leydesdorff
At the end of all this, then, one has, starting from the lowest level:

a) information as what is processed by a computer;

b) information as a scalar quantity of uncertainty removed, the
entropy/negentropy picture;

c) semantic information as well-formed, meaningful data (Floridi);

d) information as a process operator that makes a difference to and for
other processes, including above all those of receivers and senders.



Dear Joseph and colleagues, 
 
I agree with the distinction of four operations, but it seems to me that
this can be expressed more parsimoniously using information theory. Given
Bateson's (1972) formulation that information can be considered as "a
difference which makes a difference", one should distinguish between the
first type of difference and the second. Let's say difference(1) and
difference(2). (I'll need difference(3) and difference(4) below.)
 
A difference(1) can only make a difference(2) for a system (or, more
generally, for the expectation of a system). This difference(2) is
analytically preceded by difference(1), that is, by pure differences.
Shannon-type information is contained in probability distributions. In the
binary case, this is only one difference (Y/N, F/T, open/closed); in the
non-binary case, probability distributions provide us with sets of
differences(1). These differences(1) can only make a difference(2) for a
system which contains other (orthogonal) differences. In this case one needs
one more (orthogonal) dimension of the probability distribution, which
positions the incoming (Shannon-type) information at specific moments in
time. Thus, difference(2) presumes at least a dimensionality of two in the
probabilistic entropy.
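
One numerical reading of this (my construction, not Loet's text): with two
dimensions one can measure how much the positioning itself contributes, for
instance as the mutual information T = H(x) + H(y) - H(x,y) of the matrix.
A small Python sketch with invented numbers:

```python
# Sketch: identical marginal differences(1) make a different
# difference(2) depending on how they are positioned in the
# two-dimensional matrix; mutual information measures that value.
import numpy as np

def H(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

independent = np.outer([0.5, 0.5], [0.5, 0.5])   # position adds nothing
coupled = np.array([[0.45, 0.05],
                    [0.05, 0.45]])               # same marginals, coupled

for joint in (independent, coupled):
    T = H(joint.sum(axis=1)) + H(joint.sum(axis=0)) - H(joint)
    print(f"T = {T:.3f} bits")                   # 0.000 vs. 0.531
```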
 
When the system develops, difference(3) can be defined with reference to the
time axis (recursion). This is Brillouin's (1962) Delta H. The difference(1)
that made a difference(2) for the system makes a difference(3) over time.
When the system operates as a self-organizing, autonomous or autopoietic
system, it is additionally able to provide the information with a meaning
from the perspective of hindsight, that is, against the axis of time. This
"incursion" can make a difference(4). 
 
In other words, one needs at least a vector (one dimension of the entropy)
to contain an uncertainty. One needs (at least) two dimensions of the
probabilistic entropy to position the information in a network (matrix)
at specific moments in time. Three dimensions are needed when the time axis
is additionally included; four when the direction along the time axis can be
considered as another degree of freedom.
 
The two approaches seem very akin to me, but I claim that mine is stricter
and more parsimonious, because I need only numbers of dimensions of the
probabilistic entropy and not concepts like différance. The next-order
probability distributions can be considered as probabilities of probability
distributions, and so on.
 
Best wishes, 
 
 
Loet
  _  

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR), 
Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20-525 6598; fax: +31-20-525 3681 
l...@leydesdorff.net; http://www.leydesdorff.net/ 

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-22 Thread Darvas Gyorgy
Dear Joe, Dear FISers,

I would like to add a short remark. In fact, I repeat a point that I have
mentioned several times in the past. We often fail to list a very simple
form of information among the others, namely what I call information on
existence. This is a double-valued property, in the words of Joe a "binary
opposition". Something exists here and now, or it does not. E.g., I am here
or I am not here; you are hungry or you are not hungry; it is red or not
red; that object has the same colour as this one, or a "different" colour,
etc. (Note that difference is not necessarily a binary category in itself,
because it compares two things, where one of the compared things has one
property, while the other side - asymmetrically - is many-valued. These many
values may be finite, discretely infinite, or may belong to a smooth
continuum.)

This observation comes from my experience with the description of symmetries
in physics. For decades, different kinds of symmetry were described by
discrete groups, then by (continuous) Lie groups - more and more complicated
appearances of symmetry phenomena. The significance of parity was recognised
late, in the nineteen-fifties, and I would say that there are even now
(fortunately not all, only) many physicists who are surprised when they meet
the difference between the behaviour of parity in odd- and even-dimensional
spaces. This is caused by a lack of general knowledge about the nature of
parity - even though it is the simplest appearance of the phenomenon.
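
For non-physicists, the odd/even point can be checked in a few lines of
Python (an illustration of my own, not Gyuri's): the parity operator
x -> -x has determinant (-1)^n in n dimensions, so in even-dimensional
spaces it is a proper rotation, while in odd-dimensional spaces it is a
genuine reflection:

```python
# Sketch: parity (point reflection through the origin) in n dimensions.
# det = (-1)^n: +1 in even dimensions (reachable as a rotation),
# -1 in odd dimensions (a genuine reflection) -- hence the different
# behaviour of parity in odd- and even-dimensional spaces.
import numpy as np

for n in (1, 2, 3, 4):
    parity = -np.eye(n)
    print(n, round(np.linalg.det(parity)))   # -1, 1, -1, 1
```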

Something similar happened around existence-type information.
It is important, it is lovely, worthy of affection. Let us not
forget/neglect it.
Gyuri





[Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-21 Thread joe.bren...@bluewin.ch




 

Dear FIS Colleagues and Friends,

 

As you have for a long time before me, I have been trying to tame (I prefer
the French "make private" – apprivoiser) the notion of information. One
thought was suggested by Bateson's seemingly generally accepted dictum of
"a difference (and/or distinction) that makes a difference". But I think
this difference is no ordinary "delta"; it is an active referring, or better
differing, term like the différance of Derrida. I'm sure someone has made a
reference to this before – I'm new here – but then Derrida uses différance
to question the structure of binary oppositions, and says that différance
"invites us to undo the need for balanced equations, to see if each term in
an opposition is not after all an accomplice of the other. At the point
where the concept of différance intervenes, all of the conceptual
oppositions of metaphysics, to the extent that they have for ultimate
reference the presence of a present … (signifier/signified;
diachrony/synchrony; space/time; passivity/activity, etc.) become
non-pertinent." Since most of the usual debates about information are based
on such conceptual oppositions, and on classical notions of here and now, it
may be high time to deconstruct them.

 

I am sure you are familiar with this, but I
found it rather interesting to read that Kolmogorov had given one definition
of information as “any operator which
changes the distribution of probabilities in a given set of events”.
(Apparently, this idea was attacked by Markov.)
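
Read literally, Kolmogorov's phrase admits a direct formalization: an
information operator is any map from a probability distribution over a set
of events to another. A minimal Python sketch (my own construction; Bayesian
conditioning, with invented numbers, is used here only as one convenient
instance of such an operator):

```python
# Sketch (my construction) of Kolmogorov's definition read literally:
# an information operator is any map taking a probability distribution
# over a set of events to another one. Bayesian conditioning on evidence
# is one concrete instance; nothing here is specific to Kolmogorov's text.
from typing import Callable, Dict

Distribution = Dict[str, float]
InfoOperator = Callable[[Distribution], Distribution]

def condition_on(likelihood: Distribution) -> InfoOperator:
    """Build an operator that re-weights a prior by a likelihood."""
    def operator(prior: Distribution) -> Distribution:
        posterior = {e: prior[e] * likelihood.get(e, 0.0) for e in prior}
        z = sum(posterior.values())
        return {e: p / z for e, p in posterior.items()}
    return operator

prior = {"rain": 0.3, "sun": 0.7}
observe_clouds = condition_on({"rain": 0.9, "sun": 0.2})  # hypothetical evidence
print(observe_clouds(prior))   # {'rain': 0.66..., 'sun': 0.34...}: changed
```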

 

Différance in the informational context then started looking to me
like an operator, especially since in my process logic, where logical elements
of real processes resemble probabilities, the logical operators are also
processes, such that a predominantly actualized positive implication, for 
example,
is always accompanied by a predominantly potentialized negative implication.

 

At the end of all this, then, one has, starting from the lowest level:

a) information as what is processed by a computer;

b) information as a scalar quantity of uncertainty removed, the
entropy/negentropy picture;

c) semantic information as well-formed, meaningful data (Floridi);

d) information as a process operator that makes a difference to and for
other processes, including above all those of receivers and senders.

 

A first useful consequence is that information "operations" with my operator
are naturally polarized: positive, negative, or some combination which I'll
leave open for the moment. The negative effects of some information follow
naturally. Many of you may conclude that I'm doing some oversimplification
or conflation, and I apologize for that in advance. But I believe that
Kolmogorov's original idea has been neglected in the recent discussions of
information I've seen, and I would very much welcome comments. Thank you and
best wishes.

 

Joseph  



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis