Re: [Fis] Is information physical?

2018-04-25 Thread Ulanowicz, Robert
Dear Mark,

I share your inclination, albeit from a different perspective.

Consider the two statements:

1. Information is impossible without a physical carrier.

2. Information is impossible without the influence of that which does not exist.

There is significant truth in both statements.

I know that Claude Shannon is not a popular personality on FIS, but I
admire how he first approached the subject. He began by quantifying,
not information in the intuitive, positivist sense, but rather the
*lack* of information, or "uncertainty", as he put it. Positivist
information thereby becomes a double negative -- any decrease in
uncertainty.

In short, the quantification of information begins by quantifying
something that does not exist, but nonetheless is related to that
which does. Terry calls this lack the "absential", I call it the
"apophatic" and it is a major player in living systems!

Karl Popper finished his last book with the exhortation that we need
to develop a "calculus of conditional probabilities". Well, that
effort was already underway in information theory. Using conditional
probabilities allows one to parse Shannon's formula for diversity into
two terms -- one being positivist information (average mutual
information) and the other apophasis (conditional entropy).
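
To make the parsing explicit (my notation, not part of the original
post), with joint probabilities p(i,j) and marginals p(i) and p(j):

    H = -\sum_{i,j} p(i,j) \log p(i,j)

    I = \sum_{i,j} p(i,j) \log \frac{p(i,j)}{p(i)\,p(j)}

    \Phi = -\sum_{i,j} p(i,j) \log \frac{p(i,j)^2}{p(i)\,p(j)}

so that H = I + \Phi, where I is the positivist term (average mutual
information) and \Phi the apophatic one (conditional entropy).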


This duality in nature is evident but often unnoticed in the study of
networks. Most look at networks and immediately see the constraints
between nodes. And so it is. But there is also indeterminacy in almost
all real networks, and this often is disregarded. The proportions
between constraint and indeterminacy can readily be calculated.
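
As a concrete illustration (a minimal sketch of mine, in Python; nothing
here is from the original exchange), the proportions follow directly
from any joint distribution of exchanges among nodes:

    import numpy as np

    def decompose(p):
        """Split Shannon diversity H of a joint distribution p[i, j] into
        average mutual information (constraint) and conditional entropy
        (indeterminacy), so that H = AMI + Hc."""
        p = np.asarray(p, dtype=float)
        pi = p.sum(axis=1, keepdims=True)   # row marginals
        pj = p.sum(axis=0, keepdims=True)   # column marginals
        nz = p > 0                          # skip zeros (0 log 0 = 0)
        H = -np.sum(p[nz] * np.log2(p[nz]))
        AMI = np.sum(p[nz] * np.log2((p / (pi * pj))[nz]))
        return H, AMI, H - AMI

    # A weakly constrained 2x2 "network" of exchanges:
    H, AMI, Hc = decompose([[0.30, 0.20],
                            [0.15, 0.35]])
    print(f"constraint: {AMI / H:.1%}, indeterminacy: {Hc / H:.1%}")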

What is important in living systems (and I usually think of the more
indeterminate ecosystems, rather than organisms [but the point applies
there as well]) is that some degree of conditional entropy is
absolutely necessary for system sustainability, as it provides the
flexibility required to construct new responses to novel challenges.

While system constraint usually abets system performance, systems that
become too efficient do so by decreasing their (mutually exclusive)
flexibility and become progressively vulnerable to collapse.
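
To see that trade-off numerically (again a hypothetical sketch, reusing
decompose() from above), slide a joint distribution from fully
indeterminate toward fully constrained:

    # w = 0: independent (maximum flexibility); w = 1: diagonal
    # (maximum efficiency, zero conditional entropy left).
    for w in (0.0, 0.5, 0.9, 1.0):
        p = [[0.25 + 0.25 * w, 0.25 - 0.25 * w],
             [0.25 - 0.25 * w, 0.25 + 0.25 * w]]
        H, AMI, Hc = decompose(p)
        print(f"w = {w:.1f}: AMI = {AMI:.3f} bits, Hc = {Hc:.3f} bits")

At w = 1 the system is maximally efficient but retains no flexibility
with which to meet a novel perturbation.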

The lesson for evolutionary theory is clear. Survival is not always a
min/max (fitt*est*) issue. It is about a balance between adaptation
and adaptability. Ecosystems do not attain maximum efficiency. To do
so would doom them. The balance also puts the lie to a major maxim of
economics, which is that nothing should hinder the efficiency of the
market. That's a recipe for "boom and bust".

Mark, I do disagree with your opinion that information cannot be
measured. The wider application of information theory extends beyond
communication and covers the information inherent in structure, or
what John Collier calls "enformation". Measurement is extremely
important there. Perhaps you are disquieted by the relative nature of
information measurements. Such relativity is inevitable. Information
can only be measured with respect to some (arbitrary) reference
distribution (a requirement known in the wider realm of thermodynamics
as "the third law").

Remember how Bateson pointed to the overwhelmingly positivist nature
of physics. Classical physics is deficient in its lack of recognition
of the apophatic. Information theory cures that.

Yes, information requires a material carrier. It is also intimately
affected by, and requires, nonmaterial apophasis.

Best wishes,
Bob

On 4/24/18, Burgin, Mark wrote:
> Dear Colleagues,
>
> I would like to suggest the new topic for discussion
>
>Is information physical?
>
> My opinion is presented below:
>
> Why some people erroneously think that information is physical
>
> The main reason to think that information is physical is the strong
> belief of many people, especially scientists, that there is only
> physical reality, which is studied by science. At the same time, people
> encounter something that they call information.
>
> When people receive a letter, they comprehend that it is information
> because with the letter they receive information. The letter is
> physical, i.e., a physical object. As a result, people start thinking
> that information is physical. When people receive an e-mail, they
> comprehend that it is information because with the e-mail they receive
> information. The e-mail comes to the computer in the form of
> electromagnetic waves, which are physical. As a result, people start
> thinking even more that information is physical.
>
> However, letters, electromagnetic waves and actually all physical
> objects are only carriers or containers of information.
>
> To understand this better, let us consider a textbook. Is it possible
> to say that this book is knowledge? Any reasonable person will say that
> the 

Re: [Fis] New Year Lecture

2018-01-10 Thread Ulanowicz, Robert
Just a few short comments in response to Mark & John:

We definitely must reconsider the logic of biology! To start with, we must
abandon the Aristotelian prohibition of circular causality, as Alicia
Juarrero suggests. Life is all about recursion! Then there's the inherent
dialectical nature of living systems, as espoused by Lupasco and championed
by Joseph.

Although more a matter of perspective than logic, we need to focus on life
as configurations of processes, rather than objects moving according to
universal laws. I have even suggested that this shift entails an entire
revision of scientific metaphysics
<https://people.clas.ufl.edu/ulan/publications/philosophy/3rdwindow/>.

And there's the position of communication in regard to information theory.
Historically, of course, IT emerged out of communication theory. It has
grown, however, to envelop constraint in general, as Stan points out. One
can speak of the information inhering in a structure in abstraction from
any consideration of communication. John Collier calls this "enformation"
and it is quite amenable to treatment by IT. One can quantify the
information in a stable structure, or more importantly as regards life, in
a configuration of processes
<https://people.clas.ufl.edu/ulan/publications/ecosystems/ecolasc/>.
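
As a rough sketch of what such a quantification can look like (my own
minimal example; the linked papers develop the full treatment), one can
compute the mutual constraint inhering in a static matrix of flows:

    import numpy as np

    def flow_ami(T):
        """Average mutual constraint (bits) of a flow configuration,
        where T[i][j] is the flow from compartment i to compartment j."""
        p = np.asarray(T, dtype=float)
        p = p / p.sum()                     # joint distribution of flow events
        pi = p.sum(axis=1, keepdims=True)
        pj = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return np.sum(p[nz] * np.log2((p / (pi * pj))[nz]))

    # A 3-compartment cycle with small leaks: highly, though not fully,
    # constrained.
    T = [[0, 10, 1],
         [1, 0, 10],
         [10, 1, 0]]
    print(f"{flow_ami(T):.3f} bits per unit of flow")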

With regard to what is driving the adaptation that John cites among cells
and their environment, there is the centripetality that is a consequence of
autocatalytic configurations, an attribute that was not yet included in
Varela's autopoietic narrative (ibid.). Way back in 1960, Bertrand Russell
pointed to it as the "drive behind all of evolution", but few have taken up
his lead. For over 20 years I have been advocating the inclusion of this
phenomenon among the fundamental characteristics of life -- one that
provides directionality to living entities and the origin of selfhood and
striving. Terry is now working on a book describing an independent
treatment of the phenomenon.

Finally, there's the need to move beyond the confines of the positivist
thinking that constitutes most of physics and seriously to consider
heterogeneity and absence. As Elsasser indicated, heterogeneity pushes us
beyond the control of the universal laws and the circumscribed logic of
physics. The universal laws are not broken, but the combinatorics of
heterogeneity implies that they can only constrain, not determine,
outcomes in a universe rife with contingency. As to reckoning absence,
IT is the perfect vehicle to quantify the entropic absence that is so
critical to sustainable life
<https://people.clas.ufl.edu/ulan/files/Reckon.pdf>.

In all, I would posit that, if we want to accomplish a true understanding
of living phenomena, we need to start thinking outside the current box --
WAY OUTSIDE!!

My best to all,
Bob

On Wed, Jan 10, 2018 at 5:08 AM, Mark Johnson wrote:

> Dear John,
>
> Thank you very much for this - a great way to start the new year!
>
> I'd like to ask about "communication" - it's a word which is
> understood in many different ways, and in the context of cells, is
> hard to imagine.
>
> When you suggest that “the unicellular state delegates its progeny to
> interact with the environment as agents, collecting data to inform the
> recapitulating unicell of ecological changes that are occurring.
> Through the acquisition and filtering of epigenetic marks via meiosis,
> fertilization, and embryogenesis, even on into adulthood, where the
> endocrine system dictates the length and depth of the stages of the
> life cycle, now known to be under epigenetic control, the unicell
> remains in effective synchrony with environmental changes.” It seems
> that this is not communication of ‘signs’ in the Peircean sense
> supported by the biosemioticians (Hoffmeyer). But is it instead a
> recursive set of transductions, much in the spirit of Bateson’s
> insight that:
>
> “Formerly we thought of a hierarchy of taxa—individual, family line,
> subspecies, species, etc.—as units of survival. We now see a different
> hierarchy of units—gene-in-organism, organism-in environment,
> ecosystem, etc. Ecology, in the widest sense, turns out to be the
> study of the interaction and survival of ideas and programs (i.e.,
> differences, complexes of differences, etc.) in circuits.” (from his
> paper "Pathologies of Epistemology" in Steps to an Ecology of Mind)
>
> Recursive transduction like this is a common theme in cybernetics –
> it's in Ashby's "Design for a Brain", Pask's conversation theory, and
> in Beer’s Viable System Model, where “horizon scanning” (an
> anticipatory sub-system gathering data from the environment) is an
> important part of the metasystem which maintains viability of the
> organism (It’s worth noting that Maturana and Varela's autopoietic
> theory overlooks this).
>
> "Communication" would then be much more like 

Re: [Fis] some notes

2017-11-19 Thread Ulanowicz, Robert
Dear Loet,

Shannon "information" is indeed counter-intuitive, and we have John von
Neumann's joke to thank for that confusion.Shannon asked von Nuemann what
to name his formula. von Neumann told him to name it "entropy", because his
metric is "formally identical to von Boltzmann's probabilistic measure of
entropy, and because no one really knows what entropy is, so you'll be
ahead in any argument!" Whence the conflation of entropy with Shannon's
measure of diversity.

"Meaningful information" is calculated in Bayesian fashion by comparing the
*differences* between the apriori and aposteriori distributions (or sender
and receiver in the *subdiscipline* of communication). It is called the
"average mutual information", (AMI) which serves as a form of "proto
meaning" -- detractors of the Shannon approach notwithstanding. You, in
fact, have published numerous fine papers centered about AMI.
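
Concretely (a minimal sketch of mine, not drawn from anyone's post), AMI
is the expected divergence between the a posteriori and a priori
distributions; the residual is the conditional entropy taken up next:

    import numpy as np

    # Joint distribution p(x, y) linking sender (x) and receiver (y):
    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
    px = pxy.sum(axis=1)                    # a priori distribution of x
    H = -np.sum(pxy * np.log2(pxy))         # total Shannon diversity
    AMI = 0.0
    for y in range(2):
        py = pxy[:, y].sum()
        post = pxy[:, y] / py               # a posteriori: p(x | y)
        AMI += py * np.sum(post * np.log2(post / px))
    print(f"AMI = {AMI:.3f} bits; conditional entropy = {H - AMI:.3f} bits")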

The residual between entropy and AMI is called the "conditional entropy".
This warns us to be careful concerning entropy and information: The
probabilistic entropy of both Boltzmann and Shannon doesn't quite
characterize true entropy
<https://people.clas.ufl.edu/ulan/files/ECOCOMP2.pdf>. It encompasses
*both* didactic constraint (AMI) and its absence (an apophasis). (Entropy,
strictly speaking, is an apophasis, which is why we have such trouble
wrapping our minds around the concept!) Thermodynamic entropy, measured by
heat and temperature, comes closer to grasping the true nature of the
apophasis, but even there ambiguity remained, and it became necessary to
postulate the third law of thermodynamics. (Entropy at absolute zero is
zero.) (N.b., physicists "define" entropy as the Boltzmann formula in a
vain effort to "sanitize" thermodynamics. The messy engineering roots of
thermodynamics have always been an irritant for physicists, going all the
way back to the time of Carnot.)

And so we need to be careful when approaching the concepts of information
and entropy. We need to think first before dismissing the Shannon approach
as having no relation to meaning (it bears on meaning, albeit in a
primitive fashion), and we need always to keep in mind that *both*
concepts are always relative in nature -- never absolute!

My best to all,
Bob U.

On Sat, Nov 18, 2017 at 3:18 AM, Loet Leydesdorff wrote:

> Dear Terry and colleagues,
>
> I agree that one should not confuse communication with the substance of
> communication (e.g., life in bio-semiotics). It seems useful to me to
> distinguish between several concepts of "communication".
>
> 1. Shannon's (1948) definitions in "The Mathematical Theory of
> Communication". Information is communicated, but is yet meaningfree. These
> notions of information and communication are counter-intuitive (Weaver,
> 1949). However, they provide us with means for the measurement, such as
> bits of information. The meaning of the communication is provided by the
> system of reference (Theil, 1972); in other words, by the specification of
> "what is comunicated?" For example, if money is communicated
> (redistributed), the system of reference is a transaction system. If
> molecules are communicated, life can be generated (Maturana).
>
> 2. Information as "a difference which makes a difference" (Bateson, 1973;
> MacKay, 1969). A difference can only make a difference for a receiving
> system that provides meaning to the system. In my opinion, one should in
> this case talk about "meaningful information" and "meaningful
> communication" as different from the Shannon-type information (based on
> probability distributions). In this case, we don't have a clear instrument
> for the measurement. For this reason, I have a preference for the
> definitions under 1.
>
> 3. Interhuman communication is of a different order because it involves
> intentionality and language. The discourses under 1. and 2. are interhuman
> communication systems. (One has to distinguish levels and should not impose
> our intuitive notion of communication on the processes under study.) In my
> opinion, interhuman communication involves both communication of
> information and possibilities of sharing meaning.
>
> The Shannon-type information shares with physics the notion of entropy.
> However, physical entropy is dimensioned (Joule/Kelvin; S = k_B H),
> whereas probabilistic entropy is dimensionless (H). Classical physics, for
> example, is based on the communication of momenta and energy because these
> two quantities have to be conserved. In the 17th century, it was common to
> use the word "communication" in this context (Leibniz).
>
> Best,
> Loet
>
>