Dear FIS Colleagues and Friends,

 

As you have for a long time before me, I have been trying to tame (I prefer the French apprivoiser – literally, to make private) the notion of information. One thought was suggested by Bateson’s seemingly generally accepted dictum of “a difference (and/or distinction) that makes a difference.” But I think this difference is no ordinary “delta”; it is an active referring or, better, differing term, like the différance of Derrida. I’m sure someone has made a reference to this before – I’m new here – but then Derrida uses différance to question the structure of binary oppositions, and says that différance “invites us to undo the need for balanced equations, to see if each term in an opposition is not after all an accomplice of the other. At the point where the concept of différance intervenes, all of the conceptual oppositions of metaphysics, to the extent that they have for ultimate reference the presence of a present … (signifier/signified; diachrony/synchrony; space/time; passivity/activity, etc.) become non-pertinent.” Since most of the usual debates about information are based on such conceptual oppositions, and on classical notions of here and now, it may be high time to deconstruct them.

 

I am sure you are familiar with this, but I
found it rather interesting to read that Kolmogorov had given one definition
of information as “any operator which
changes the distribution of probabilities in a given set of events”.
(Apparently, this idea was attacked by Markov.)
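
To make that formulation a little more concrete for myself, here is a minimal sketch in Python (my own illustration, not Kolmogorov’s formalism): the “operator” is read as a stochastic matrix acting on a probability distribution over three events, and the difference it makes is simply the change in that distribution.

    # Illustrative sketch only: the three-event example and the matrix are mine.
    prior = [0.5, 0.3, 0.2]          # probabilities over a given set of events

    # A stochastic "operator": rows index the old event, columns the new one;
    # each row sums to 1.
    operator = [
        [0.8, 0.1, 0.1],
        [0.2, 0.7, 0.1],
        [0.1, 0.1, 0.8],
    ]

    posterior = [sum(prior[i] * operator[i][j] for i in range(3))
                 for j in range(3)]

    print(posterior)   # roughly [0.48, 0.28, 0.24]: the distribution has changed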

 

Différance in the informational context then started looking to me like an operator, especially since in my process logic, where the logical elements of real processes resemble probabilities, the logical operators are also processes: a predominantly actualized positive implication, for example, is always accompanied by a predominantly potentialized negative implication.
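
For what it is worth, here is a very rough numerical reading of that last point (entirely my own simplification, with the [0, 1] scale of degrees as an added assumption): each implication is paired with its contradictory, so that the degree to which one is actualized is the degree to which the other is potentialized.

    # Toy reading only; the [0, 1] scale and the exact pairing rule are
    # simplifying assumptions, not part of the process logic itself.
    def conjugate_pair(positive_actualized):
        """A positive implication actualized to degree d is accompanied by
        a negative implication potentialized to the same degree d."""
        positive = {"actualized": positive_actualized,
                    "potentialized": 1.0 - positive_actualized}
        negative = {"actualized": 1.0 - positive_actualized,
                    "potentialized": positive_actualized}
        return positive, negative

    pos, neg = conjugate_pair(0.8)   # predominantly actualized positive implication
    print(pos, neg)                  # ...with a predominantly potentialized negative one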

 

At the end of all this, then, one has, starting from the lowest level:

a) information as what is processed by a computer;

b) information as a scalar quantity of uncertainty removed, the entropy/negentropy picture (a small numerical sketch follows this list);

c) semantic information as well-formed, meaningful data (Floridi);

d) information as a process operator that makes a difference to and for other processes, including above all those of receivers and senders.
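
As a reminder of what the scalar in (b) measures, here is a minimal sketch using the standard Shannon entropy (the two example distributions are mine): the information conveyed is read as the uncertainty before minus the uncertainty after.

    import math

    # Minimal sketch of (b): information as a scalar quantity of uncertainty
    # removed; the example distributions are purely illustrative.
    def entropy(p):
        return -sum(q * math.log2(q) for q in p if q > 0)

    before = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty over four events
    after  = [0.85, 0.05, 0.05, 0.05]   # a message has sharpened the distribution

    print(entropy(before) - entropy(after))   # about 1.15 bits of uncertainty removed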

 

A first useful consequence is that information “operations” with my operator are naturally polarized: positive, negative, or some combination that I’ll leave open for the moment. The negative effects of some information then follow naturally. Many of you may conclude that I am oversimplifying or conflating things, and I apologize for that in advance. But I believe that Kolmogorov’s original idea has been neglected in the recent discussions of information I have seen, and I would very much welcome comments. Thank you and best wishes.

 

Joseph  


