Dear Joe, 
 
5. The description of differences in terms of levels of complexity and
recursion affecting Shannon-type information is essential because it
provides an analytical basis of meaning also. Perhaps the sequence goes from
vector to tensor to spinor (?) as you go up in dimensionality of the entropy
to yield valuedness or valence? 

 

Yes, it yields valuedness because the differences(1) make a difference(2),
etc. For example, the information contained in a vector is positioned in the
network/matrix, and this position has a value. The operation is recursive.
But it closes itself off at the level of four.

 

First, there are only differences(1): the expected information content of a distribution. This can make a difference(2) to an extension. Difference(3) arises when this happens not only once, but repeatedly over time, that is, in a three-dimensional array. (Is that a three-dimensional tensor?) In the next recursion, difference(4) has the additional degree of freedom of playing with the direction of time: incursion versus recursion becomes possible.
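
To make the first three steps concrete, here is a minimal sketch in Python, under the assumption that difference(1) is read as Shannon's expected information of a distribution, difference(2) as the expected information of an a posteriori distribution with reference to an a priori one (the Kullback-Leibler/Theil measure), and difference(3) as the same measurement repeated over time; the function names and numbers are illustrative only:

import numpy as np

def expected_information(p):
    # Difference(1): Shannon's expected information content of a distribution,
    # H(p) = -sum_i p_i log2 p_i.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def information_of_message(q, p):
    # Difference(2): expected information of the a posteriori distribution q
    # given the a priori distribution p, I(q:p) = sum_i q_i log2(q_i / p_i).
    q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
    mask = q > 0
    return np.sum(q[mask] * np.log2(q[mask] / p[mask]))

# Difference(3): the same measurement repeated over time, i.e. a series of
# distributions (rows = moments in time); stacking several such series gives
# the three-dimensional array mentioned above.
p_t = np.array([[0.5, 0.3, 0.2],
                [0.4, 0.4, 0.2],
                [0.3, 0.4, 0.3]])
print([expected_information(row) for row in p_t])
print([information_of_message(p_t[t + 1], p_t[t]) for t in range(len(p_t) - 1)])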

 

Let us reformulate this in terms of evolution theory: differences(1) are only variation. Difference(2) positions the variation selectively. The structure of the system determines the value of the variation. Difference(3) adds the time axis and therefore stabilization: some selections are selected for stabilization. Difference(4) adds globalization: some stabilizations are selected for globalization. Globalization means that a next-order systems level folds back on the system, closes it off, and makes it a possible carrier for a next-order systems dynamics.

 

In other words: stabilizations can be at variance and thus provide a next-order variation with reference to difference(1). Difference(4) can analogously be considered as a next-order selection mechanism. But the system now already contains time (difference(3)) and performs by also using time as a degree of freedom. The monad is constituted. It closes off --performing its own autopoiesis-- but remains open in terms of its stabilizations (= second-order variations) for other systems dimensions to build further upon. Because of its fourth dimension it is not subsumed but remains an independent reality.

 

Is this consonant with Logic in Reality? 

 

Best wishes, 

 

 

Loet

 
  _____  

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR), 
Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
l...@leydesdorff.net
http://www.leydesdorff.net/

 



  _____  

From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On
Behalf Of joe.bren...@bluewin.ch
Sent: Tuesday, February 23, 2010 7:33 PM
To: lo...@physics.utoronto.ca
Cc: fis
Subject: Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information
Operator



Dear Bob, Loet, Gyuri and All,



Progress?! Between Bob, Loet, and something of my logical approach, I see
the “art of understanding information” developing in its necessarily
dialectically connected synthetic and analytical aspects. Here are a few of
the ideas suggested by Bob’s historical notes, very useful for me, and by
Loet’s elaboration of the complexity of difference.



1. The quantitative characteristics of information are more or less clear.
My list of definitions was not intended to be exhaustive.



2. MacKay’s semantic question of what to send and where to send it concerns a process taking place in the sender’s mind. His definition of information as the change in a receiver’s mind-set, and thus as concerned with meaning, also describes a dynamic process. He should simply have added that, when the signal is finally sent, there is a change in the sender’s “mind-set” as well. These relations and changes can be described in my logical terms.



3. The Gestalt description of sufficiently complex information and meaning connected as figure and ground should have been obvious to me long ago; it wasn’t, but it certainly is now. Logic in Reality provides a principled dynamic description of the linked changes of figure and ground, alternately predominating in the mind in two dimensions. The analogy is not perfect, however: one needs to keep in mind here the vertical, inter-level relation between information and meaning. It is this kind of information, and that in point 2, that I would like to describe as logical information operators.



4. Information without meaning (Bob’s paragraph 2) is information that is, to all intents and purposes, incapable of making a direct causal difference, such as a database.



5. The description of differences in terms of levels of complexity and
recursion affecting Shannon-type information is essential because it
provides an analytical basis of meaning also. Perhaps the sequence goes from
vector to tensor to spinor (?) as you go up in dimensionality of the entropy
to yield valuedness or valence? 



6. The concept of allegedly self-organizing, autonomous and autopoïetic
systems, however, requires the further explication of the origin of these
wonderful properties in reality. Loet’s statement that the two approaches
are “very akin” is very welcome in the analytic domain, since it is indeed
more strict and parsimonious and différance is only a philosophical concept.
However, différance is in a sense directly related to complex real physical
systems, such as information producers and receivers. Ascription of
autonomy, etc., where it does exist, complicates things. I am trying to get a handle on information as an ontological operator, not one related to epistemic or doxastic differences.



7. From this perspective, a clarification to Gyuri’s note about the “simple
form of information” to which he gives the very intriguing designation
“information on existence”. Everything that has to do with existence is of
interest to me, but my Logic in Reality does not and is not intended to
apply to the entire extant domain. The examples you give, Gyuri, are binary
or in other cases in a one-to-many relation. There is existence here, double-valuedness and information, and, at an even more fundamental level, parity as a property of quantum entities. But this is not binary
opposition; there is no opposition here, no exchange of energy, no caused
differences. At the limit, there is no physical change at all of the duals
as such, or if there is, it is only of the state-transition type (cf. Loet’s
Y/N, F/T, open-closed and your same color – different color, presence -
absence). So by all means let’s discuss the concept of existence-type information. For me, the test of its utility would be the extent to which it
might apply to or be included in (as lowest level semantic information is)
something complex and interactive.



Many thanks.



Best,



Joseph



----Original Message----
From: lo...@physics.utoronto.ca
Date: 22.02.2010 15:07
To: <joe.bren...@bluewin.ch>
Cc: "Pedro Clemente Marijuan Fernandez" <pcmarijuan.i...@aragon.es>, "fis" <fis@listas.unizar.es>
Subject: Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

Dear Joseph - once again your post was most stimulating, provocative and
enjoyable. Kolmogorov's definition of information that you quote is most
interesting but like Shannon's definition incorporates the notion that
information is a quantitative concept that can be measured. The Bateson
definition that you refer to was a critique of this notion of information as
a quantitative measure. The criticism began with  MacKay (1969 Information,
Mechanism and Meaning. Cambridge MA: MIT Press.) who wrote "Information is a
distinction that makes a difference" which Bateson (1973 Steps to an Ecology
of Mind. St. Albans: Paladin Frogmore.) then built on to come up with the
more popular: "Information is a difference that makes a difference". MacKay
was the first to critique Shannon's quantitative definition of information
when Shannon wrote his famous definition:  "We have represented a discrete
information source as a Markoff process. Can we define a quantity, which
will measure, in some sense, how much information is ‘produced’ by such a
process, or better, at what rate information is produced?" – Shannon (1948, A mathematical theory of communication. Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October 1948).


According to Claude Shannon (1948, p. 379) his definition of information is
not connected to its meaning. Weaver concurred in his introduction to
Shannon’s A Mathematical Theory of Communication when he wrote: “Information
has ‘nothing to do with meaning’ although it does describe a ‘pattern’."
Shannon also suggested that information in the form of a message often
contains meaning but that meaning is not a necessary condition for defining
information. So it is possible to have information without meaning, whatever
that means. 

Not all of the members of the information science community were happy with
Shannon’s definition of information. Three years after Shannon proposed his definition of information, Donald MacKay (1951), at the 8th Macy Conference, argued for another approach to understanding the nature of information. The
highly influential Macy Conferences on cybernetics, systems theory,
information and communications were held from 1946 to 1953 during which
Norbert Wiener’s newly minted cybernetic theory and Shannon’s information
theory were discussed and debated with a fascinating interdisciplinary team
of scholars which also included Warren McCulloch, Walter Pitts, Gregory
Bateson, Margaret Mead, Heinz von Foerster, Kurt Lewin and John von Neumann.
MacKay argued that he did not see “too close a connection between the notion
of information as we use it in communications engineering and what [we] are
doing here… the problem here is not so much finding the best encoding of
symbols…but, rather, the determination of the semantic question of what to
send and to whom to send it.” He suggested that information should be
defined as “the change in a receiver’s mind-set, and thus with meaning” and
not just the sender’s signal (Hayles 1999b, p. 74). The notion of
information independent of its meaning or context is like looking at a
figure isolated from its ground. As the ground changes so too does the
meaning of the figure.

The last two paragraphs are an excerpt from my new book What is Information?
to be published by the University of Toronto Press in late 2010 or early
2011. Your post Joseph has stimulated the following thoughts that I hope to
add to my new book before it is typeset.

As MacKay and Bateson have argued, there is a qualitative dimension to information that is captured neither by the Shannon-Weaver quantitative model nor by Kolmogorov's definition. Information is multidimensional. There is a quantitative dimension, as captured by Shannon and Kolmogorov, and a qualitative one of meaning, as captured by MacKay and Bateson, but one can think of other dimensions as well. In responding to a communication by
Joseph Brenner on the Foundations of Information (FIS) listserv I described
the information that he communicated as stimulating, provocative and
enjoyable. Brenner cited the following Kolmogorov definition of information
as “any operator which changes the distribution of probabilities in a given
set of events.” Brenner's information changed the distribution of my mental
events to one of stimulation, provocation and enjoyment and so there is
something authentic that this definition of Kolmogorov captures that his
earlier cited definition of information as "the minimum computational
resources needed to describe a program or a text" does not. We therefore
conclude that not only is there a relativistic component to information but
it is also multidimensional and not uni-dimensional as is the case with
Shannon information.
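
As a minimal, hedged illustration of this operator reading of Kolmogorov (not his own formalism, and with purely hypothetical names and numbers), one can model the incoming message as a likelihood that maps a prior distribution over the receiver's states into a posterior, and then use the Shannon measure to see how much the distribution was changed:

import numpy as np

def entropy(p):
    # Expected information content (in bits) of a probability distribution.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def apply_message(prior, likelihood):
    # One possible reading of Kolmogorov's "operator which changes the
    # distribution of probabilities in a given set of events": the message,
    # represented by a likelihood over the same events, maps the prior
    # distribution into a posterior via Bayes' rule.
    posterior = np.asarray(prior, dtype=float) * np.asarray(likelihood, dtype=float)
    return posterior / posterior.sum()

# Hypothetical example: three possible states of the receiver's mind-set.
prior = [1/3, 1/3, 1/3]                        # indifference before the message
posterior = apply_message(prior, [0.7, 0.2, 0.1])
print(entropy(prior) - entropy(posterior))     # uncertainty removed, in bits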

Joseph - many thanks for your stimulating post - I look forward to your
comments on this riff on your thoughts. - Bob


______________________ 

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan

On 22-Feb-10, at 12:43 AM, joe.bren...@bluewin.ch wrote:


Dear FIS Colleagues and Friends,



As you have for a long time before me, I have been trying to tame (I prefer
the French make private – apprivoiser) the notion of information. One
thought was suggested by Bateson’s seemingly generally accepted dictum of “a difference (and/or distinction) that makes a difference.” But I think this
difference is no ordinary “delta”; this is an active referring or better
differing term like the différance of Derrida. I’m sure someone has made a
reference to this before – I’m new here – but then Derrida uses différance
to question the structure of binary oppositions, and says that différance
“invites us to undo the need for balanced equations, to see if each term in
an opposition is not after all an accomplice of the other. At the point
where the concept of différance intervenes, all of the conceptual
oppositions of metaphysics, to the extent that they have for ultimate
reference the presence of a present …(signifier/signified;
diachrony/synchrony; space/time; passivity/activity, etc.) become
non-pertinent.” Since most of the usual debates about information are based
on such conceptual oppositions, and classical notions of here and now, it
may be high time to deconstruct them.



I am sure you are familiar with this, but I found it rather interesting to
read that Kolmogorov had given one definition of information as “any
operator which changes the distribution of probabilities in a given set of
events”. (Apparently, this idea was attacked by Markov.)



Différance in the informational context then started looking to me like an
operator, especially since in my process logic, where logical elements of
real processes resemble probabilities, the logical operators are also
processes, such that a predominantly actualized positive implication, for
example, is always accompanied by a predominantly potentialized negative
implication.



At the end of all this, then, one has, starting from the lowest level:

a) information as what is processed by a computer;

b) information as a scalar quantity of uncertainty removed, the entropy/negentropy picture;

c) semantic information as well-formed, meaningful data (Floridi);

d) information as a process operator that makes a difference to and for other processes, including above all those of receivers and senders.



A first useful consequence is that information “operations” with my operator
are naturally polarized, positive, negative or some combination which I’ll
leave open for the moment. The negative effects of some information follow
naturally. Many of you may conclude I’m doing some oversimplification or
conflation, and I apologize for that in advance. But I believe that
Kolmogorov’s original idea has been neglected in the recent discussions of
information I’ve seen, and I would very much welcome comments. Thank you and
best wishes.



Joseph

_______________________________________________
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis




