Dear Bob,

In your "Shannon Exonerata" paper you give an example of three strings, their 
entropies and their mutual information. I very much admire this paper, and 
particularly the critique of Shannon and the emphasis on the apophatic, but 
some things puzzle me. If these are strings from a living thing, then we can 
assume that the strings grow over time. If sequences A, B and C are related, 
then the growth of one depends on the growth of the others. This process 
occurs in time. During the growth of the strings, even the determination of 
what is and is not surprising changes, because the distinction between what 
counts as the same and what doesn't is itself changing.

I have begun to think that it's the relative entropy between growing things 
(whether biological measurements, lines of musical counterpoint, or episodes 
of learning) that matters, particularly since mutual information is itself a 
variety of relative entropy. There are dynamics in the interactions. A change 
in entropy for one string with no change in entropy in the others (melody and 
accompaniment) is distinct from everything changing at the same time (that's 
"death and transfiguration"!).
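
To put that concretely: mutual information is the relative entropy 
(Kullback-Leibler divergence) between the joint distribution and the product 
of the marginals, I(A;B) = D( p(a,b) || p(a)p(b) ). A minimal sketch in Python 
of that calculation for two aligned strings (the function name and the example 
strings are only my own illustration, not anything from your paper):

    from collections import Counter
    from math import log2

    def mutual_information(a, b):
        # I(A;B) = D( p(a,b) || p(a)p(b) ), estimated from two equal-length strings
        n = len(a)
        joint = Counter(zip(a, b))
        pa, pb = Counter(a), Counter(b)
        return sum((c / n) * log2((c / n) / ((pa[x] / n) * (pb[y] / n)))
                   for (x, y), c in joint.items())

    print(mutual_information("AACCGGTT", "AACCGGTA"))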

Shannon's formula isn't good at measuring change in entropy. It's even less 
good with changes in the distinctions themselves, which occur at critical 
moments ("aha! a discovery!" or "this is no longer surprising"). The best we 
might do, I've thought, is to segment your strings over time and examine the 
relative entropies between the segments. I've done this with music. Does 
anyone have any other techniques?
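
The way I've done it is roughly the following (a toy sketch only: the window 
size, the smoothing constant and the example string are arbitrary choices of 
mine). Each successive segment's symbol distribution is compared with the 
previous one by relative entropy, so a jump in the sequence of divergences 
marks a moment where the string's "habits" have changed:

    from collections import Counter
    from math import log2

    def distribution(segment, alphabet, smooth=1e-6):
        # smoothed symbol distribution over the full alphabet (avoids zero counts)
        counts = Counter(segment)
        total = len(segment) + smooth * len(alphabet)
        return {s: (counts[s] + smooth) / total for s in alphabet}

    def relative_entropy(p, q):
        # D(p || q) in bits; p and q share the same (smoothed) support
        return sum(p[s] * log2(p[s] / q[s]) for s in p if p[s] > 0)

    def segment_divergences(string, window):
        # relative entropy of each segment's distribution against the previous segment's
        alphabet = sorted(set(string))
        segments = [string[i:i + window] for i in range(0, len(string) - window + 1, window)]
        dists = [distribution(seg, alphabet) for seg in segments]
        return [relative_entropy(p, q) for q, p in zip(dists, dists[1:])]

    print(segment_divergences("ABABABABCCCCABABABAB", 5))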

On the apophatic, I can imagine a study of the dynamics of Ashby's homeostat 
where each unit produces one of your strings. The machine comes to its solution 
when the entropies of the dials are each 0 (redundancy 1). As the machine 
approaches its equilibrium, the constraint of each dial on every other can be 
explored through the relative entropies between the dials. If we wanted the 
machine to keep searching and not settle, it's conceivable that we might add 
more dials to the mechanism as the relative entropies started to approach 0. 
What would this do? It would maintain a counterpoint in the relative entropies 
within the ensemble. Would adding a dial increase the apophasis? Or the 
entropy? Or the relative entropy? 
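
I can only gesture at this, but here is the crudest possible toy of what I 
mean. It is emphatically not Ashby's actual step-mechanism, just a stand-in 
with negative feedback and noise; the thresholds and parameters are my own 
guesses. The entropy of each dial's recent readings is monitored, and a fresh 
dial is thrown in whenever they have all collapsed towards 0, so the ensemble 
is pushed off its equilibrium and keeps searching:

    import random
    from collections import Counter
    from math import log2

    def entropy(values, bins=8):
        # Shannon entropy (bits) of dial readings quantized into coarse bins
        labels = [int(v * bins) for v in values]
        n = len(values)
        return -sum(c / n * log2(c / n) for c in Counter(labels).values())

    dials = [random.uniform(-1, 1) for _ in range(4)]
    history = [[] for _ in dials]

    for step in range(500):
        for i in range(len(dials)):
            others = [d for j, d in enumerate(dials) if j != i]
            # crude negative feedback: each dial moves against the mean of the others
            dials[i] += 0.1 * (-sum(others) / len(others) - dials[i]) + random.gauss(0, 0.01)
            history[i].append(dials[i])
        windows = [h[-50:] for h in history]
        # when every dial's recent entropy has collapsed, throw in a fresh dial
        if all(len(w) == 50 for w in windows) and all(entropy(w) < 0.5 for w in windows):
            dials.append(random.uniform(-1, 1))
            history.append([])

    print("dials at the end:", len(dials))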

Best wishes,

Mark 

-----Original Message-----
From: "Robert E. Ulanowicz" <u...@umces.edu>
Sent: 09/10/2017 15:20
To: "Mark Johnson" <johnsonm...@gmail.com>
Cc: "foundationofinformationscience" <fis@listas.unizar.es>
Subject: Re: [Fis] Data - Reflection - Information


> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.
<https://people.clas.ufl.edu/ulan/files/FISPAP.pdf> &
<https://people.clas.ufl.edu/ulan/files/Reckon.pdf>

Bob
