Re: [Fis] What is "Agent"?

2017-10-16 Thread Robert E. Ulanowicz
Dear Krassimir,

In agreement or partial agreement with most responses, I see the kernel of
agency as autocatalysis, by virtue of the centripetality that this dynamic
engenders.

Autocatalysis is a subset of feedbacks wherein each link in a loop
benefits the next member. It is easy to show that such action selects for
all perturbations that augment inputs to any member. That is, the loop
acts like a vortex to draw resources into the autocatalytic orbit. (See
p. 289 of , or better
yet, pp. 70-73 in .)

Centripetality is a directional phenomenon that defines the self of any
living being. It is ubiquitous to *all* living entities, but is hardly
ever mentioned among the necessary attributes of life. Bertrand Russell
called it “chemical imperialism” and cited it as the drive behind *all of
evolution*. It is the generatrix of competition. No actor can compete at
any level without centripetality at work at the next level down.

Centripetality, the expression of agency, serves not only to change the
environment of the organism, but does so in a way that sustains and
imparts advantage to the self.

Intelligence does not seem to be a necessary attribute of such agency.

Greetings to all,
Bob

> Dear FIS Colleagues,
>
> After our pleasant collaboration over the last weeks, a paper called “Data
> versus Information” has been prepared as a very early draft and has
> already been sent to the authors for refining.
> Many thanks for the fruitful work!
>
> What we have so far is the understanding that information is something
> more than data.
> In other words:
>  d = r
>  i = r + e
> where:
>  d => data;
>  i => information;
>  r => reflection;
>  e => something Else, internal to the Agent (subject, interpreter,
> etc.).
>
> A simple question: What is an “Agent”?
>
> When does an entity become an Agent? What is needed to qualify an entity
> as an Agent, or as an Intelligent Agent? What kind of agent is the cell?
> In the end, does information exist for all Agents or only for Intelligent
> Agents?
>
> Thesis: Information exists only for the Intelligent Agents.
>
> Antithesis: Information exists at all levels of Agents.
>
> Friendly greetings
> Krassimir
>
>
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Data - Reflection - Information

2017-10-13 Thread Robert E. Ulanowicz
Dear Mark,

Thank you for your interest in my FIS paper!


I didn't intend by it to imply that Shannon-class measures were the
ultimate tool for information science, only to argue against prematurely
rejecting that thrust entirely -- as so many do. By looking at Bayesian
forms of the Shannon measure we can address information per se (and even a
form of proto-meaning) and achieve a measure of what is missing. This
latter advantage opens up another dimension to science. (The apophatic had
been implicitly addressed by thermodynamic entropy, which has hardly ever
been recognized as an apophasis. That's why entropy remains so confusing
to so many!)

The Achilles' heel of Shannon-like measures lies in the underlying
assumption of distinct categories with which to describe the
distributions. The boundaries between categories are often "fuzzy", and,
as you point out, they change with time and growth.

I have been told that mutual information(s) has been defined over fuzzy
sets, but I confess I haven't investigated the advantages of this
extension. As for changing numbers of categories, I note that mutual
information remains well-defined even when the numbers of categories in
the sets being compared are not the same. So I would encourage your
exploration with musical forms.
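
For anyone who wants to experiment, here is a minimal sketch in Python of
the point just made: mutual information is computed from the joint
distribution alone, so the two sets being compared may have different
numbers of categories (the 3-by-2 table below is invented purely for
illustration):

import numpy as np

# Joint distribution over 3 categories of X and 2 categories of Y;
# the category counts need not match.
p_xy = np.array([[0.20, 0.10],
                 [0.05, 0.25],
                 [0.30, 0.10]])

p_x = p_xy.sum(axis=1)   # marginal of X (3 categories)
p_y = p_xy.sum(axis=0)   # marginal of Y (2 categories)

# I(X;Y) = sum_ij p(x,y) * log2[ p(x,y) / (p(x)p(y)) ]
mask = p_xy > 0
mi = np.sum(p_xy[mask] * np.log2(p_xy[mask] / np.outer(p_x, p_y)[mask]))
print(f"I(X;Y) = {mi:.4f} bits")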

As to Ashby's metaphor of the homeostat as a machine, my personal preference
is to restrict mechanical analogs for living systems to only those that
are unavoidable. I feel the language of mechanics and mechanisms is
*vastly* overused in biology and draws our attention away from the true
nature of biotic systems.

Thank you for your challenging and astute questions!

Cheers,
Bob

> Dear Bob,
>
> In your Shannon Exonerata paper you have an example of three strings,
> their entropies and their mutual information. I very much admire this
> paper and particularly the critique  of Shannon and the emphasis on the
> apophatic, but some things puzzle me. If these are strings of a living
> thing, then we can assume that these strings grow over time. If sequences
> A,B and C are related, then the growth of one is dependent on the growth
> of the other. This process occurs in time. During the growth of the
> strings, even the determination of what is and is not surprising changes
> with the distinction between what is seen to be the same and what isn't.
>
>  I have begun to think that it's the relative entropy between growing
> things (whether biological measurements, lines of musical counterpoint,
> learning) that matters. Particularly as mutual information is a variety
> of relative entropy. There are dynamics in the interactions. A change in
> entropy for one string with no change in entropy in the others (melody
> and accompaniment) is distinct from everything changing at the same time
> (that's "death and transfiguration"!).
>
> Shannon's formula isn't good at measuring change in entropy. It's less
> good with changes in distinctions which occur at critical moments ("aha! A
> discovery!" Or "this is no longer surprising") The best that we might do,
> I've thought, is segment your strings over time and examine relative
> entropies. I've done this with music. Does anyone have any other
> techniques?
>
> On the apophatic, I can imagine a study of the dynamics of Ashby's
> homeostat where each unit produced one of your strings. The machine comes
> to its solution when the entropies of the dials are each 0 (redundancy 1)
> As the machine approaches its equilibrium, the constraint of each dial on
> every other can be explored by the relative entropies between the dials.
> If we wanted the machine to keep on searching and not settle, it's
> conceivable that you might add more dials into the mechanism as its
> relative entropy started to approach 0. What would this do? It would
> maintain a counterpoint in the relative entropies within the ensemble.
> Would adding the dial increase the apophasis? Or the entropy? Or the
> relative entropy?
>
> Best wishes,
>
> Mark


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Data - Reflection - Information

2017-10-09 Thread Robert E. Ulanowicz

> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.


Bob

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] If "data = information", why we need both concepts?

2017-10-07 Thread Robert E. Ulanowicz
Dear Michel,

I spent my career doing much the same thing with mutual information, which
in this case quantifies the degree of constraint among the species.

Encouraged by the suggestions of E.P. Odum, I hypothesized that ecosystems
would naturally increase in the product of their gross activities times
the mutual information among the network of interactions -- a product
(fashioned after the Gibbs/Helmholtz free energies) that I called system
"ascendency".

After about two decades of measuring the ascendencies of diverse
ecosystems, the data were telling me that my hypothesis was wrong.
Ecosystems do not continually progress in increasing ascendency, but
rather achieve a balance between ascendency (a surrogate for efficiency)
and its complement, the system overhead (which mirrors reliability).
Furthermore, the quantitative nature of the balance is notably insensitive
to the type of ecosystem under study, averaging about 40% efficiency and
60% redundancy. (See Figure 7 on p. 1890 of .)
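
For concreteness, a minimal Python sketch of these quantities (the flow
matrix below is a toy example, not data from any of the studies cited):
ascendency is the total system throughput times the average mutual
information of the flow structure, development capacity is the throughput
times the Shannon index of the flows, and overhead is their difference.

import numpy as np

# Toy flow matrix: T[i, j] = flow from compartment i to compartment j.
T = np.array([[ 0.0, 40.0, 10.0],
              [ 5.0,  0.0, 20.0],
              [15.0,  5.0,  0.0]])

TST = T.sum()            # total system throughput
out = T.sum(axis=1)      # total outflow from each compartment
inp = T.sum(axis=0)      # total inflow to each compartment

nz = T > 0
# Ascendency: A = sum_ij T_ij * log2( T_ij * TST / (out_i * inp_j) )
A = np.sum(T[nz] * np.log2(T[nz] * TST / np.outer(out, inp)[nz]))
# Development capacity: C = -sum_ij T_ij * log2( T_ij / TST )
C = -np.sum(T[nz] * np.log2(T[nz] / TST))
overhead = C - A
print(f"efficiency A/C = {A/C:.2f}, redundancy = {overhead/C:.2f}")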

Now, you might argue that constraint is not information and so these
results are not germane to our discussion, but I (and I think Stan) would
propose that constraint is actually the most generalized form of
information, and the Bayesian forms of the Shannon measure beautifully
parse the division between efficiency and reliability.

While I didn't set out to falsify my initial hypothesis, that is indeed
what eventually happened. Notice that it was accomplished in quantitative
fashion and without any recourse whatsoever to system dynamics. The
decades-long exercise demonstrates, I think, a phenomenological approach
to the science of life pursued in abstraction of (but not contradiction
to) the underlying physics and chemistry.

I read a little French and would very much like to read your work on
mutual information.

Cordially,
Bob

> Dear colleagues
>
> Loet thinks that "none of us provides an operative framework and a
> single (just one!) empirically testable prediction able to assess
> "information""
>
> In my ecological work, I try to understand the relations between living
> organisms and their environment, and I use Brillouin's formula (and
> non-inferential statistics) to compute the "mutual information" between
> each species of plant or animal and each constraint of the
> environment. The testable prediction is, for example, the potential area
> of a species.
>
> The book where that method is explained is written in French, but I
> could translate this example into English if you think that it could be
> published.
>
> Cordially, M. Godron
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Information: a metaphysical word; what is chemical information?

2017-04-14 Thread Robert E. Ulanowicz
Dear Jerry,

My apologies for taking so long to reply! I have been overwhelmed with
queries on email, civic and social obligations and responsibilities for
liturgical rubrics in my parish. I haven’t spent very much time online as
a result.

I sympathize entirely with your apprehensions, but I think they are
inapplicable here.

I share your concerns that homogeneous variables like mass, energy,
charge, etc. are inappropriate to heterogeneous situations. Gregory
Bateson made the distinction between the former, which he called “pleroma”
and the latter, which he characterized as “creatura”. He pointed out how
the former are insufficient to describe the latter. Later, Walter Elsasser
pointed out how the logic of the laws of physics, equivalent as they are
to operations on homogeneous sets, does not apply to heterogeneous
systems, especially biological ones.
<http://www.vordenker.de/elsasser/we_logic-biol.pdf>

You are certainly correct in pointing out that one needn’t be concerned
only with living systems, in that the transition from physics to chemistry
already crosses this divide.

The problem is that I do not see the divide as being as dichotomous as you
portray it.

You are probably aware of the realm of chemical thermodynamics, where the
effort has been made to incorporate attributes of heterogeneity into
variables that characterize the entire system. For example, the Gibbs (and
Helmholtz) free energies are defined for chemical reaction systems. Changes
in the various tokens will contribute to changes in the whole system
quantity.

It is an attempt to marry the two domains by folding the heterogeneous
tokens into a pseudo-homogeneous system function. Of course this is
precisely what the Shannon formula does. The key to my assertion is that
the Shannon variable can be decomposed WITH RESPECT TO A SECOND, REFERENCE
DISTRIBUTION into two terms – one which quantifies the order (amount of
constraint) that the two distributions exert on each other and a second that
quantifies the freedom that the two enjoy from each other. The second
term, called the “conditional entropy” in information theory, is actually a
better homolog to physical entropy than the Shannon formula.

In applying this calculus to arbitrary networks, Rutledge et al. (J.
theor. Biol. 57:355) showed true genius by identifying the first
distribution with the distribution of inputs into the nodes and the second
as the distribution of outputs from the same nodes.  Hence the mutual
information (total effective mutual constraints) becomes a measure of the
internal order in the system and the second is a surrogate for its
entropy. The key to my assertion is that these two terms are precisely
complementary, so that if one is somehow indeterminate, the other must be
likewise.
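
A minimal statement of that decomposition in symbols (my notation, with
p_ij the joint frequency of input category i and output category j, and
marginals p_i. = SUM_j p_ij and p.j = SUM_i p_ij):

  H   = - SUM_ij p_ij log p_ij                     (Shannon index)
  I   =   SUM_ij p_ij log[ p_ij / (p_i. * p.j) ]   (mutual constraint)
  Hc  = - SUM_ij p_ij log[ p_ij^2 / (p_i. * p.j) ] (conditional entropy)

and, term by term, H = I + Hc exactly, so any indeterminacy in one
component is perforce mirrored in the other.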

Now, I confess that I have taken a major liberty in identifying
statistical entropy with physical entropy. But the third law has its
homolog in information theory in the result that statistical entropy is
always relative. Whence, the inherent structural information of the
network must likewise always remain relative.

I further confess that I have always inveighed against identifying
physical entropy with statistical entropy. They are, however, accurate
homologs of one another, and that is all that I am claiming.

I remark in passing that the mapping of physical elements into the
integers is decidedly homomorphic and not isomorphic. The number 6 refers
to the number of protons in the nucleus of a carbon atom, nothing more.
There are a variety of isotopes, ionic and radical forms that also map
into the same integer. Each has its own properties that would factor into
any physical measurement on a mixture of these varieties.

So, in conclusion, I would readily agree that a thermodynamics that is
strictly confined to pleroma cannot fully illumine attributes of a
heterogeneous system. But contemporary thermodynamics is not so
constrained. Neither is information theory, nor is chemistry without its
own hidden heterogeneities.

As regards information theory, statistical entropy is always relative,
which forces its complementary information to be likewise.

Cheers,
Bob

> List, Bob:
>
>> On Mar 27, 2017, at 10:37 AM, Robert E. Ulanowicz <u...@umces.edu>
>> wrote:
>>
>> First off, that information is always relative is the obverse of the
>> third
>> law of thermodynamics. It cannot be otherwise.
>> <http://people.clas.ufl.edu/ulan/files/FISPAP.pdf
>> <http://people.clas.ufl.edu/ulan/files/FISPAP.pdf>>
>
> First off?
>
> I fear that I am rather skeptical about this assertion for a simple
> structural reason that illuminates the scientific inadequacy of
> thermodynamics as source of scientific apperceptions.
>
> The notion of a general mathematical form of information residing within
> the Third Law of Thermodynamics is difficult for me to imagine because of
> the chemical table of el

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Robert E. Ulanowicz
In order:

John,

I agree. For example, if one identifies information with constraint, the
notion of information as causation becomes tautologous. It also feeds into
the notion of "It from bit"!

Terry,

I agree, best to remain as catholic as possible in our conception of the
notion.

Otto:

Spot-on! Feedbacks among non-living components provided the cradle for the
early emergence and proliferation of information. (See p. 147ff in .)

Cheers to all,
Bob U.

> Dear all,
> Just to comment on the discussion after Terrence's apt cautionary words...
>
> The various notions of information are partially a linguistic confusion,
> partially a relic of multiple conceptual histories colliding, and
> partially
> an ongoing negotiation (or even a war, to state it less creditably and
> with
> less civility), about the future of the term as a (more or less unified)
> scientific concept.
>
> To latch onto that negotiation, let me propose that an evolutionary
> approach to information can capture and explain some of that ambiguous
> multiplicity in terminology, by showing how pre-biotic natural processes
> developed feedback loops and material encoding techniques - which was a
> type of localised informational emergence - and how life, in developing
> cellular communication, DNA, sentience, memory, and selfhood, rarified
> this
> process further, producing informational processing such that had never
> existed before. Was it the same information? Or was it something new?
>
> Human consciousness and cultural semiosis are a yet higher level
> adaptation
> of information, and computer A.I. is something else entirely, for - at
> least for now - it lacks feelings and self-awareness and thus "meaning" in
> the human sense. But it computes, stores and processes. It might even
> develop suprasentience whose structure we cannot fathom based on our
> limited human perspective.  Is it still the same type of information? Or
> something different? Is evolution in quality (emergence) or only in
> quantity (continuous development)?
>
> I generally take the Peircean view that signification (informative
> relationality) evolves, and information, as an offshoot of that, is thus a
> multi-stage process - EVEN if it has a simple and predictable elemental
> substructure (composed of say, 1s and 0s, or quarks and bosons).
>
> Information might thus not only have a complex history of emergence, but
> also an unknown future, composed of various leaps in cosmic organization.
>
> In ignorant wonder, all the best,
>
> Otto Lehto,
>
> philosopher, political economist,
> PhD student at King's College London,
> webpage: www.ottolehto.com,
> cellphone: +358-407514748
>
> On Mar 28, 2017 23:24, "Terrence W. DEACON"  wrote:
>
>> Corrected typos (in case the intrinsic redundancy didn't compensate for
>> these minor corruptions of the text):
>>
>>  information-beqaring medium =  information-bearing medium
>>
>> appliction = application
>>
>>  conceptiont =  conception
>>
>> On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON
>> 
>> wrote:
>>
>>> Dear FIS colleagues,
>>>
>>> I agree with John Collier that we should not assume to restrict the
>>> concept of information to only one subset of its potential
>>> applications.
>>> But to work with this breadth of usage we need to recognize that
>>> 'information' can refer to intrinsic statistical properties of a
>>> physical
>>> medium, extrinsic referential properties of that medium (i.e. content),
>>> and
>>> the significance or use value of that content, depending on the
>>> context.  A
>>> problem arises when we demand that only one of these uses should be
>>> given
>>> legitimacy. As I have repeatedly suggested on this listserve, it will
>>> be a
>>> source of constant useless argument to make the assertion that someone
>>> is
>>> wrong in their understanding of information if they use it in one of
>>> these
>>> non-formal ways. But to fail to mark which conception of information is
>>> being considered, or worse, to use equivocal conceptions of the term in
>>> the
>>> same argument, will ultimately undermine our efforts to understand one
>>> another and develop a complete general theory of information.
>>>
>>> This nominalization of 'inform' has been in use for hundreds of years
>>> in
>>> legal and literary contexts, in all of these variant forms. But there
>>> has
>>> been a slowly increasing tendency to use it to refer to the
>>> information-beqaring medium itself, in substantial terms. This reached
>>> its
>>> greatest extreme with the restricted technical usage formalized by
>>> Claude
>>> Shannon. Remember, however, that this was only introduced a little over
>>> a
>>> half century ago. When one of his mentors (Hartley) initially
>>> introduced a
>>> logarithmic measure of signal capacity he called it 'intelligence' —
>>> as in
>>> the gathering of intelligence by a spy organization. So had 

Re: [Fis] Fw: A Curious Story

2017-01-21 Thread Robert E. Ulanowicz
Dear Joseph, Pedro & Otto,

Just my own 2 cents on a topic with which I have little familiarity. I
heartily agree with our dear departed friend Michael Conrad. We are indeed
looking at the underlying physics of the universe, however, I would
maintain (and I think that Joseph and Otto would probably agree), the
physics we see is not entirely subsumed under the conventional scientific
metaphysics. In fact, I wrote a book trying to articulate in systematic
fashion what I think that amended metaphysics looks like.


Greetings to all,
Bob U.

> Dear Pedro and All,
>
> Thanks to Pedro again for this thought-provoking theme. We are all in
> states of greater or lesser ignorance regarding it!
>
> Here is just, again, a thought about your quote of Conrad: "when we look
> at a biological system we are looking at the face of the underlying
> physics of the universe."
>
> I.M.H.O., this statement is true but only partially so. There are
> non-thermodynamic parts of the underlying physics of the universe that are
> not visible at the biological level of reality, and a coupling between
> them remains to be demonstrated. Quantum superposition and self-duality
> have analogies in macroscopic physics, but quantum non-locality and
> sub-quantum fluctuations do not.
>
> Of course, if you allow slightly altered laws of nature, many things may
> be possible as Smolin suggests. However, I suggest that the domain of
> interaction between actual and potential states in our everyday 'grown-up'
> world also has things to tell us, e.g., about information, that can be
> looked at more easily.
>
> Best wishes,
>
> Joseph


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] A Curious Story

2017-01-11 Thread Robert E. Ulanowicz
Dear Prof. Roessler:

My training in quantum physics lies over a half-century in the past, so I
cannot add or detract from the specifics of this issue without exposing my
ignorance. I can only respond as an engineer with a devotion to the field
of dimensional analysis.

I note that the Planck constant and the gravitational constant deal with
phenomena that are roughly some 43 orders of magnitude apart. Common
engineering practice holds that dimensionless constants that differ by
more than 5 orders of magnitude can be neglected in treating the problem a
hand. Either the phenomenon in question is so fast that it is always in
equilibrium with respect to more pertinent dynamics, or so slow that it
takes on the guise of a boundary constraint.

This is why I have always been skeptical of natural small black holes. It
seems to me that in order to include the two constants into a
dimensionless ratio of order one, one would have to combine them with
characteristic distance and mass parameters of very large magnitudes --
such as those of galactic or cosmic scale. Such combination might be
interposed artificially, of course, but I wouldn't expect the resultant
black holes to behave like galactic systems.
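
For the record, the textbook combination runs as follows (standard
dimensional analysis, not specific to the CERN debate): the only mass and
length that can be built from h-bar, G and the speed of light c are

  m_P = sqrt( h-bar * c / G )   ~  2.2 x 10^-8  kg   (Planck mass)
  l_P = sqrt( h-bar * G / c^3 ) ~  1.6 x 10^-35 m    (Planck length)

so any dimensionless ratio of order one involving both constants must
import characteristic masses and lengths wildly removed from those of
elementary particles.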

I know this sounds like ignorance or witchcraft to trained physicists, but
engineers often have to evaluate and deal with systems for which the
governing dynamics cannot be specified -- and dimensional analysis usually
provides quite a reliable gauge.

Respectfully,
Bob Ulanowicz

> I like this response from Lou, Otto
>
>   From: Louis H Kauffman 
>  To: Pedro C. Marijuan 
> Cc: fis 
>  Sent: Tuesday, January 10, 2017 6:09 PM
>  Subject: Re: [Fis] A Curious Story
>
> Dear Folks,
> It is very important not to be hasty and assume that the warning
> Professor Rossler made is to be taken seriously. It is relatively easy to
> check if a mathematical reasoning is true or false. It is much more
> difficult to see if a piece of mathematics is correctly aligned to
> physical prediction. Note also that a reaction such as "THIS STORY IS A
> GOOD REASON FOR SHUTTING DOWN CERN PERMANENTLY AND SAVING A LOT OF LARGELY
> WASTED MONEY." is not in the form of scientific rational discussion, but
> rather in the form of taking a given conclusion for granted and using it
> to support another opinion that is just that - an opinion.
> By concatenating such behaviors we arrive at the present political state
> of the world.
> This is why, in my letter, I have asked for an honest discussion of the
> possible validity of Professor Rossler’s arguments.
> At this point I run out of commentary room for this week and I shall read
> and look forward to making further comments next week.
> Best,
> Lou Kauffman
>
>
> On Jan 9, 2017, at 7:17 AM, Pedro C. Marijuan 
> wrote:
>
> Forwarded message from Alex Hankey
> | Subject: | Re: [Fis] A Curious Story |
> | Date:    | Sun, 8 Jan 2017 19:55:55 +0530 |
> | From:    | Alex Hankey |
> | To:      | PEDRO CLEMENTE MARIJUAN FERNANDEZ |
>
>
>
>  THIS STORY IS A GOOD REASON FOR SHUTTING DOWN CERN PERMANENTLY AND SAVING
> A LOT OF LARGELY WASTED MONEY.
>  On 5 January 2017 at 16:36, PEDRO CLEMENTE MARIJUAN FERNANDEZ
>  wrote:
>
>   Dear FISers,
>   Herewith the Lecture inaugurating our 2017 sessions. I really hope that
> this Curious Story is just that, a curiosity. But in science we should
> not look for hopes but for arguments and counter-arguments...
>   Best wishes to All and exciting times for the New Year! --Pedro
>
>
> From: Otto E. Rossler [oeros...@yahoo.com]
>  Sent: Wednesday, 4 January 2017 17:51
>  To: PEDRO CLEMENTE MARIJUAN FERNANDEZ
>  Subject: NY session
> --
>   A Curious Story -- Otto E. Rossler, University of Tübingen, Germany
>
>   Maybe I am the only one who finds it curious. Which fact would then make
> it even more curious for me. It goes like this: Someone says “I can
> save your house from a time bomb planted into the basement” and you
> respond by saying “I don’t care.” This curious story is taken from
> the Buddhist bible. It of course depends on who is offering to
> help. It could be a lunatic person claiming that he alone can save the
> planet from a time-bomb about to be planted into it. In that case, there
> would be no reason to worry. On the other hand, it could also be that
> you, the manager, are a bit high at the moment so that you don't fully
> appreciate the offer made to you. How serious is my offer herewith made
> to you today? I only say that for eight years' time already, there
> exists no counter-proof in the literature to my at first highly
> publicized proof of danger. I was able to demonstrate that the miniature
> black holes officially attempted to be produced at CERN do possess two
> radically new properties:
>
>
>- they 

Re: [Fis] _ Reply to Annette (A Priori Modeling)

2016-06-22 Thread Robert E. Ulanowicz
I agree with Stan.

The Shannon formula measures "capacity" for information, *not* information
itself. Consider the "snow" pattern on a TV without a signal. Its Shannon
measure is much higher than when a picture appears onscreen, yet we know
that the snow pattern carries no information.

We should begin with entropy, which is the *lack* of constraint. A
system's entropy is a measure of its degrees of freedom to move around.
Please note that entropy is an apophasis (something that does *not*
exist). That's what makes entropy such a difficult concept to grasp.

In contrast, information is present in all forms of constraint, something
that is palpable (apodictic). In communication theory such constraints are
evident in the associations between characters and signals. But constraint
exists beyond the narrow realm of communication, so that the information
in any structure, static or dynamic, can be quantified using the Shannon
calculus. Thus, the
concept of information *transcends* the realm of communication.

So what about the Shannon measure? The distribution used to compute the
Shannon measure can be compared with any reference measure and split into
two components. One component, called the average mutual information,
reveals the amount of constraint between the two distributions, whereas
the second, called the conditional entropy, gauges the freedom that each
distribution has with respect to the other. The two terms sum exactly to
the Shannon measure.

Actually the term "conditional entropy" is redundant, because entropy can
never be calculated without reference to another state. This is called the
"third law of thermodynamics" and it applies to statistical measures just
as much as to physical thermodynamic measures.

Notice, however, that if the calculation of entropy is conditional, then
the measure of information is likewise conditional  (because they sum to
yield the Shannon measure). This conditional nature of information is an
ambiguity that leads to much of our confusion about the meaning of
information. (It keeps FIS discussions lively! :)

The Shannon measure and its decomposed components can all be readily
extended to multiple dimensions and applied to a host of truly complex
events and structures.

Most discussion remains focused on the Shannon measure in abstraction from
all else, which makes the index appear almost meaningless and of limited
utility. The (Bayesian) decomposition of the Shannon formula, however, is
quite rich in what it can reveal, even going so far as to apprehend
proto-meaning. 

The best to all,
Bob

> Entropy
>
> Regarding:
>> So I see it that you confirm to Shannon´s interpretation of entropy as
> actually being information <
> Well, in essence we may agree, but I would call this an unfortunate choice
> of words. “Information," I think, has come to mean so many things to so
> many people that it is *nearly* a useless term. Even though I use this
> term
> myself, I try to minimize its use. I would say that I agree with
> Shannon’s
> view of signal entropy as a *type* of information – and then extend that
> concept using type theory, to include “meaningful” roles. Only when
> taken
> as a whole does “information” exist, within my framing.
>
> S: It has been shocking to me that many info-tech persons use the word
> 'information' when what they mean is Shannon's 'information carrying
> capacity' or the word 'entropy' when they mean Shannon's 'informational
> entropy', referring to variety.
>
> STAN


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fwd: Re: Cancer Cure? (Plamen S.)

2016-06-02 Thread Robert E. Ulanowicz
>  Mensaje reenviado 
> Asunto:   Re: [Fis] Cancer Cure?
> Fecha:Tue, 31 May 2016 19:54:05 +0200
> De:   Dr. Plamen L. Simeonov 
> Para: Robert Ulanowicz 
> CC:   Pedro C. Marijuan 
> Dear Bob and All,
> it is a compliment for me to read your notes on the subject. You don't
> need to apologize. It is indeed a complex world of relations. And it is
> good that you mentioned all this again from your perspective. We do not
> know how many have entered the discussion later. Reiterations and
> questions are always good and welcome. Well, I was expecting a vigorous
> discussion on this subject, which approaches its end now. But it is still
> better to have one piece of feedback than to write all this on paper on my
> own without knowing what the reviewer or the reader would say at the
> end. I still hope to hear a few more voices on that. We could take on
> some of the other two major groups of diseases mentioned in the opening
> session.

Dear Plamen and Pedro,

Thank you for your kind words. I hope I am not going over my weekly quota
by answering, but I will remain quiet for a while after this.

> Bob, I am glad that you mentioned quantum logic. Do you think
> we can try using it to express the emergent state of a disease (in
> combination or not with heterogeneity-affine SOC)? We are not limited to
> cancer only.

I am no expert in any kind of logic, but am acutely aware that our world
requires more than the standard Aristotelian sort. (Just ask Joe Brenner.)
As for quantum behavior at macroscopic scales, I remain quite skeptical
that it is the same phenomenon that operates at atomic and subatomic
scales. On the other hand, I am quite open to quantum-like behaviors at
macroscales. I think a few investigators are aware of this ontological
difference and are treating the subject in the right manner --
dimensionally speaking.

For example, Dr. Diederik Aerts of the Vrije Universiteit Brussel was able
to show that quantum-like behavior can transpire in macrosystems in total
abstraction of the Planck distance and the quantum vacuum. His associate,
Dr. Sandro Sozzo, has applied Aerts' ideas to ecology.

> In fact I am also interested to know your opinion on such
> aspects as self-similarity or symmetry/asymmetry during the development
> of a disease throughout all transition phases. These issues have been
> often discussed in a different context at FIS.

I acknowledge self-similarity in physical systems and imagine some of that
behavior bleeds over into biology (as, for example, with Aerts' work that I
just mentioned). I don't see self-similarity as a major player in biology,
however. My familiarity with dimensional analysis tells me that one should
always look for qualitatively different behavior at different scales and
that asymmetry plays a greater role in biology than it does in physics.

> How about the
> biosemiotics aspect which I mentioned earlier?

I think biosemiotics provides a viable pathway to understanding living
systems. I wasn't
always a fan of it, thinking that its narrative was too anthropomorphic.
Jesper Hoffmeyer, however, showed me some convincing examples that were
decidedly not anthropomorphic.

> Tell me what you think could be a promising approach to tackle a
> tough health problem.

As you possibly may know, my hobby horse has been quantified flow
networks, because they force one to think in holistic terms. (Not that
holism is all there is, but it is usually a player in any living system.)
In my first book, Growth and Development (p160) and later in my second
book, Ecology, the Ascendent Perspective (pp149-151), I mentioned how
medicine needed to consider more than just oncogenes in their approach to
cancer therapy. I suggested that more attention needed to be paid to whole
system behaviors, especially that of the immune system. Well, of course
immunotherapy has now become the most promising therapy against cancer
(but unfortunately not because of my remarks :).

> Is there anybody out there? :-)
> All the best,
> Plamen

Some of us are listening! :)

The best,
Bob



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Cancer Cure?

2016-05-30 Thread Robert E. Ulanowicz

> And yet, SOC is only one of the theoretical options that can resonate
> together. What I am interested to know is: do you think that SOC is a good
> point to start from when moving from physics to biology?

Dear Plamen:

Most renditions of SOC with which I am familiar involve single homogeneous
variables. I am of the opinion that physics is preoccupied with
homogeneity, whereas biology is all about heterogeneity. Therefore, I am
skeptical whether descriptions in terms of homogeneous variables (e.g.,
matter, energy, charge, mass) can ever be sufficient descriptors of
biological systems. Mind you, they may still be true (e.g., Bejan's
constructual law), but because they do not explicitly embody
heterogeneity, they will always be inadequate long-term descriptors of
living systems.

The common assumption has been that one can advance from homogeneous
variables to heterogeneous systems via the formulation of intricate
boundary-value statements connecting the many dimensions, but this is
usually impossible for both epistemic and ontological reasons. (One is
unable because of combinatorics to predicate the full link-up conditions
[epistemic], and the underlying many equations possess insufficient
stability to track complex systems [ontic].)

Somehow, explicit account needs be taken of system heterogeneity, such as
is done with some network metrics. The world of complexity is one of
*massive* heterogeneity. Physics, the study of the homogeneous, can't cut
it alone. (As Stu Kauffman puts it, we have reached the end of the "Era of
Physics". Not that physics won't still advance, but that not every event
and behavior in the complex world needs to be referred back to it.)

My personal opinion, of course.

Best wishes,
Bob


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Clarifying Posting

2016-05-06 Thread Robert E. Ulanowicz
> Pedro -- In short, how might phenomenology relate to science?  There is
> one
> approach - to physiology - that was taken by the British physiologist,
> John
> B. Haldane.  He did ALL his experiments upon himself.
>
> STAN

Dear Pedro,

Most of the discussion has centered about phenomenology in the sense of
Husserl. The topic is broader, however, and remains the foundation of the
engineering philosophy that has guided my career.

I have long advocated a phenomenological approach to biology as the only
way forward. I have devoted years to the phenomenological study of
ecosystems trophic exchange networks and have shown how hypothesis
falsification can be possible in abstraction of eliciting causes
.
I have gone so far as to propose an alternative metaphysics to
conventional mechanical/reductionist theory that followed from
phenomenological premises.


So I would submit that phenomenology is alive and well as a practical and
even quantitative tool in science. It's just that, as an engineer, I find
Husserl tough going. :)

Warm regards,
Bob

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] The next round on physics and phenomenology

2016-05-02 Thread Robert E. Ulanowicz
Dear Alex,

I have considerable sympathy with the phenomenological backbone of your
argument. I would caution, however, about relying on quantum theory (a la
Planck) as a literal support of it.

I was trained as an engineer to place great emphasis on dimensional
considerations, specifically on the Buckingham-Pi theorem. Engineers
reckon the magnitude of various phenomena according to dimensionless
ratios. As a rule of thumb, if a dimensionless ratio is either smaller
than 10**(-5) or larger than 10**5, the two phenomena being compared can
usually be considered dynamically independent.
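
A sketch of how that rule of thumb gets applied in practice (the threshold
and the example numbers are mine, purely for illustration):

import math

def dynamically_independent(scale_a, scale_b, threshold_decades=5.0):
    """Rule of thumb: scales differing by more than ~5 orders of
    magnitude mark phenomena that can be treated as dynamically
    independent (one equilibrates; the other acts as a boundary
    constraint)."""
    decades = abs(math.log10(scale_a / scale_b))
    return decades > threshold_decades

# Illustrative characteristic times in seconds:
print(dynamically_independent(1e-13, 1.0))  # molecular vibration vs. heartbeat: True
print(dynamically_independent(0.1, 1.0))    # two coupled timescales: False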

Now Planck’s constant and the gravitational constant differ by 40 or so
orders of magnitude, and so I remain extremely skeptical of any attempts
to co-join the two. Back in the late 90’s I wrote (somewhere?) that
Hawking’s effort to marry the two were futile. He gave up the quest some
10 years later (obviously not in response to my skepticism! :).

I have colleagues in ecology, who have tried to portray ecosystems as
“coherence domains” due to the same quantum phenomena as give rise to
coherence structures among water molecules. Once again, I remain highly
skeptical (macroscopic entanglement arguments notwithstanding).

That having been said, I do think that quantum-*like* behavior does exist
at macroscopic scales, and that what you have been describing likely is an
example of same. The case with water molecules appears to be that minute
phasings in the quantum vacuum can travel between molecules faster than
the speed of light, at which the inter-molecular forces travel between the
molecules, thereby serving as a cue to maintain coherence.

In ecosystems, information can travel via light or other physical means
faster than the relationships among participating species interact, and so
coherence could be maintained via that route. In the brain,
electromagnetic waves that accompany electron and proton movements can
travel between neurons faster than synaptic signals (which take ca.
one-tenth of a second). Whence, I see a significant possibility that
consciousness represents a neuronal “coherence domain” quite in
abstraction from the subatomic quantum realm.
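
For scale (my arithmetic, with round numbers): an electromagnetic
disturbance crosses a 0.1 m brain in roughly 0.1 / (3 x 10^8) ~ 3 x 10^-10
s, whereas a synaptic relay takes ca. 10^-1 s -- a ratio of nearly nine
orders of magnitude, comfortably beyond the 10^5 rule of thumb cited above.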

The notion of consciousness as a coherence domain has some attractiveness
when one notes that the subjective feeling of consciousness is one of
“awareness of everything at once”.

And so I would conclude that I think you are pursuing a worthwhile
hypothesis, but I would encourage you to think outside the envelope of
Planckian phenomena. As an example of how quantum homologs might be
treated at macroscopic scales, I would recommend the work of Dr. Diederik
Aerts at the Vrije Universiteit Brussel, and the ecological writings of his
associate Dr. Sandro Sozzo.


I would urge other participants on FIS also to cast a critical eye upon
the efforts of many physicists to totalize cosmological behavior in terms
of quantum theory. The form may be universal, but IMHO the actual
phenomena are more likely particular to the scale of observation.

Peace,
Bob U.


> Dear Plamen,
>
> Thank you for the encouragement in the spirit of 'Fare Thee Well',
> rather than 'Adieu, Dear Friend, A..', I suspect.
>
> I am attaching my presentation with the qualification that:
>
> The first half of the presentation explicitly constructs a new information
> theory applying at the apex of biological control systems, showing how it
> conforms to properties of experience postulated by Kant, Husserl, Chalmers
> and others; the second half applies the information structure to
> human-animal mind-to-mind communications from recent decades.
>
> I hope that everyone will find this novel approach pertinent.
> All good wishes,
>
> Alex Hankey
>
>
> On 23 April 2016 at 13:45, Dr. Plamen L. Simeonov <
> plamen.l.simeo...@gmail.com> wrote:
>
>>
>>
>> Dear Pedro, Alex and Colleagues,
>>
>> thank you for this introduction of the next round on physics and
>> phenomenology with Alex' challenging theory. I’d like to share with
>> you a
> curious blog by Philip Ball which a friend dropped me earlier this
>> morning:
>> http://nautil.us/issue/35/boundaries/why-physics-is-not-a-discipline.
>>
>> Farewell, Alex!
>>
>> Plamen
>>
>>
>> ___
>> Fis mailing list
>> Fis@listas.unizar.es
>> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>>
>>
>
>
> --
> Alex Hankey M.A. (Cantab.) PhD (M.I.T.)
> Distinguished Professor of Yoga and Physical Science,
> SVYASA, Eknath Bhavan, 19 Gavipuram Circle
> Bangalore 560019, Karnataka, India
> Mobile (Intn'l): +44 7710 534195
> Mobile (India) +91 900 800 8789
> 
>
> 2015 JPBMB Special Issue on Integral Biomathics: Life Sciences,
> Mathematics
> and Phenomenological Philosophy
> 

[Fis] _ Re: _ In defense of quantum mechanics

2016-04-04 Thread Robert E. Ulanowicz
Ladies & Gentlemen:

To conserve on my postings, I would like to consolidate three comments:

The first is an addendum by Dr. Ed Dellian, historian of science regarding
the linear vs. quadratic forms of energy in QM. I append his remarks below.

Secondly, I note Mark Johnson’s remarks:

"More deeply, Bateson’s highlighting of the difference between the way
we think and the way nature works is important. How can a concept of
information help us to think in tune with nature, rather than against
it?"

If we accept that the way we think is fundamentally different from the
way nature works, how might a concept of information avoid
exacerbating the pathologies of human existence? Wouldn’t it just turn
us into information bible-bashers hawking our ideas in online forums
(because universities are no longer interested in them!)? Would new
metrics help? Or would that simply create new scarcity in the form of
a technocratic elite? Or maybe we’re barking up the wrong tree. Maybe
it’s not “information” at all (whatever that is) – or maybe it’s “not
information”.

I like ‘not information’ as the study of the constraints within which
our crazy thinking takes place because it continually draws us back to
> what isn't thought.”

Mark, I think the greatest contribution IT can make to our view on nature is
the ability it affords us to consider and even quantify the effects of
apophasis (that which does not exist). Recall that information is defined
as a double negative, so that the starting point is the *lack* of
constraint (an apophasis). Bateson pointed out how almost all of science
is positivist in viewpoint, but how often the absence of something is what
is most important in affecting results. Information theory allows us to
view nature “with both eyes open” to perceive the fundamental dialectic
between order formation and entropic decay.


Thirdly, I quote Soeren:

"the concept of experience and meaning does not exist in the
vocabulary of the theoretical framework of natural sciences"

I would agree with the statement from the aspect of pragmatism – surely we
will never fully encapsulate all that is associated with subjective human
“meaning”.

I would disagree with the statement in the absolute sense, however,
because I believe the rudiments of meaning are indeed quantifiable. Take,
for example, the correspondences between the protein surfaces of an
invasive microbe and an antibody, where a lock-and-key relationship can be
described and quantified. To the antibody, this correspondence captures
the entire meaning of the microbe to the antibody’s existence.
 Somewhere along the way
from an antibody to the human being our ability to quantify meaning
necessarily breaks down, but I don’t think that meaning can be proscribed
from information theory *absolutely*.

Now here are Ed’s remarks:

   
Dear Bob,

the subject "linear versus squared concept of energy" is so important
that I want to add to my former comments the "story behind the story".

   As I have already said it began with Leibniz's 1686 short paper
"Brevis demonstratio erroris memorabilis Cartesii et aliorum", notably
published just one year before Newton's Principia. Leibniz in his paper
argued against the "measure of force" of a material body's motion, as it
was used by his contemporaries in the context of the Cartesian
philosophy of the time, i. e. the measure "matter times velocity", in
modern notation mv. This concept, already used by Galileo, was confirmed
as the true "quantity of uniform straightline motion" of a body, as soon
as in the years 1669-1671 the most famous scientists of the time, John
Wallis, Christiaan Huygens, and Christopher Wren, by order of the Royal
Society, had independently of each other investigated the case. Based on
collision experiments, they all corroborated the truth of the said
measure; accordingly it has survived until today under the name of
"momentum p".

  Now, if the quantity of uniform-straightline motion is correctly
measured by the product mv, what is the measurable quantity of the
"force" that causes such a motion? Modern science denies that there
exists such a cause; uniform straightline motion is said to result from
the "inertia of matter" alone, which inertia is seen not as some
measurable /quantity/, but as an intrinsic /quality/ of matter. However,
in the middle of the 17th century scientists indeed measured not only
the momentum p = mv of uniform-straightline motion, but also the "force"
or "cause" of that motion - and correctly so: Even though some believed
in a geometric /proportionality /of cause and effect, while some others
took cause and effect as /equivalents/, in both cases the quantity of
active motion-generating force was to be measured through the quantity
of the generated effect, that is, through the quantity of motion mv.

  This was the situation 

[Fis] _ Re: _ Re: _ Re: _ Re: On mathematical theories and models in biology

2016-03-29 Thread Robert E. Ulanowicz
Dear Guy,

Please allow me to respond to your invitation to Terry with my two cents.

My triad for supporting the dynamics of life is a bit different. I see the
three essential fundamentals as:

1. Aleatoricism

2. Feedback

3. Memory

Just to briefly elaborate on each:

1. I use aleatoricism to avoid the baggage associated with the term
"chance", which most immediately associate with "blind" chance. The
aleatoric spans the spectrum from unique events to blind chance to
conditional chance to propensities to just short of determinism.

2. More specifically (and in parallel with autopoiesis) I focus on
autocatalytic feedback, which exhibits the property of "centripetality".
Centripetality appears on almost no one's list of properties of life,
despite its ubiquity in association with living systems.

3. Memory (and information) likely inhered in stable configurations of
processes (metabolism) well before the advent of molecular encoding. Terry
speaks to this point in Biological Theory 1(2):136-49.

My fundamentals do not include reproduction, because I see reproduction as
corollary to 2 & 3.

I propose a full metaphysics for life predicated on these three
assumptions.


Looking forward to what others see as fundamental.

Peace,
Bob


> I personally consider metabolism to be at the core of what constitutes
> ‘life’, so the notion of autopoiesis is very attractive to me. It is
> also possible that the richness of life as we know it depends on having
> metabolisms (activity), genomes (memory), and reproduction combined.  The
> reductionistic approach to singling out one of these three pillars of life
> as its essence may be futile.  However, I want to point out that the most
> reduced version of ‘life’ I have seen was proposed by Terry Deacon in
> the concept he calls “autocells”.  Terry has made great contributions
> to FIS dealing with related topics, and I hope he will chime in here to
> describe his minimalist form of life, which is not cellular, does not have
> any metabolism or genetically encoded memory.  Autocells do, however,
> reproduce.
>
> Regards,
>
> Guy


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] _ Re: _ Closing lecture

2016-02-02 Thread Robert E. Ulanowicz
Dear Howard,

Thank you for leading a very interesting discussion! Beyond my earlier
comments, I just wanted to add what I believe is a minority opinion among
the FIS group.

I believe that information possesses both epistemic and ontic features.
Many opinions have been expressed concerning the former, and they are all
well-considered. I, however, wish to put in a word concerning the latter:

From an epistemic viewpoint, information can be considered in abstraction
from any particular material manifestation. It is possible, however, to
regard information as inhering in any physical structure or configuration
of processes *without* any reference to communication theory, e.g.,
sender, receiver, coding, alphabet, etc. For example, the structure of a
network of processes possesses a *measurable* degree of *self-referenced*
information, which
can be calculated without any connections to communication. The mutual
information inhering in such a configuration is a measure of the
constraints extant in the structure (Stan Salthe) and has been termed
"enformation" by John Collier
.

That having been said, Bob Logan is correct, any measurement of said
information/enformation is perforce relative according to the whims of the
one performing the measurement. This is actually a complementary
consequence of the Third Law of thermodynamics.


Most look upon enformation as an adumbrated form of what they see as the
full concept of information, but it has proved a rather useful tool in
evaluating structures.

Again, thanks for the Ooomph you have imparted to us! :)

The best,
Bob U.

>
>
> First, a few responses. I agree with Hans von Baeyer. Pedro’s kindness is
> magic.
> I agree with Gyorgy Darvas that quarks communicate.
> I also agree with Jerry Chandler. Brute force is not the major mover of
> history. Values and virtues count. A lot. In fact, a culture organizes
> itself by calling one way of doing things evil—brute force—and another
> way of doing things a value and a virtue. Our way is the value and the
> virtue. The ways of others are brute force and evil. We see cooperation
> and warmth among us. But only enmity and destruction among them.
> The brute force is not within groups, where values, virtues, and
> compassion prevail. It’s between groups. It’s in the pecking order battles
> between groups.
> Which means, in answer to Marcus Abundis, yes, groups struggle for
> position in inter-group hierarchies like chickens in a barnyard. For
> example, America and China are vying right now for top position in the
> barnyard of nations. Russia’s in that battle, too. On a lower level, so
> are Saudi Arabia and Iran, whose proxy war in Syria for pecking order
> dominance has cost a quarter of a million lives. That’s brute force.
> Between groups whose citizens are often lovely and loving to each other.
> Whose citizens are proud of their values and virtues.
> Now for a final statement.
> Information exists in a context. That’s not at all surprising.
> Information is all about context. As the writings of Guenther Witzany
> hint. And as Ludwig Wittgenstein also suggested. Information is
> relational. Information does not exist in a vacuum. It connects
> participants. And it makes things happen. When it’s not connecting
> participants, it’s not information.
> FIS gets fired up to a high energy level when discussing the definition
> of information and its relationship to Shannon’s entropic information
> equation. Alas, these discussions tend to remove the context. And context
> is what gives information its indispensable ingredient, meaning.
> There are two basic approaches in science:
> · the abstract mathematical;
> · and the observational empirical.
> Mathematical abstractionists dwell on definitions and equations.
> Empirical observers gather facts. Darwin was an observational empiricist.
> I’d like to see more of Darwin’s kind of science in the world of
> information theory.
> One of Darwin’s most important contributions was not the concept of
> natural selection. It was an approach that Darwin got from Kant and from
> his grandfather Erasmus. That approach? Lay out the history of the cosmos
> on a timeline and piece together its story. In chronological order. Piece
> together the saga of how this cosmos has created itself. Including the
> self-motivated self-creation of life.
> Communication plays a vital role in this story. It appears in the first
> 10^(-32) of a second of the cosmos’ existence, when quarks communicated
> using attraction and repulsion cues. OK, it’s not quite right to call the
> cues attraction and repulsion cues. When two quarks sized each other up,
> they interpreted 

Re: [Fis] _ RE: _ Re: Cho 2016 The social life of quarks

2016-01-21 Thread Robert E. Ulanowicz
Just a few words to follow on Pedro's comments concerning Howard's question:

From our perspective all quarks are completely indistinguishable and
homogeneous, so the practical answer to Howard's question is "No, quarks
cannot communicate --period!"

It is possible, however, to imagine that quarks, being in large measure
wave packets, would at any instant be different from one another. One can
imagine multiple wave forms, dynamically changing with time. The
particular phasing between two quarks in the quantum vacuum could take on
any number of possibilities, and which possibility pertains at the time of
encounter would inform what kind of boson might result. Then it becomes
possible to speak of communication between them. It's just that we are
unable to access that level of interaction.

Cheers to all,
Bob U.

> Dear FIS Colleagues,
>
> Thanks to Jerry and Koichiro for their insightful and deep comments.
> Nevertheless the question from Howard was very clear and direct and I
> wonder whether we have responded that way --as usual, the simplest
> becomes the most difficult. I will try here.
>
> There is no "real" communication between quarks as they merely follow
> physical law--the state of the system is altered by some input according
> to boundary conditions and to the state own variables and parameters
> that dictate the way Law(s) have to intervene. The outcome may be
> probabilistic, but it is inexorably determined.
>
> There is real communication between cells, people, organizations... as
> the input is sensed (or disregarded) and judged according to boundary
> conditions and to the accumulated experiential information content of
> the entity. The outcome is adaptive: aiming at the
> self-production/self-propagation of the entity.
>
> In sum, the former is blind, while the second is oriented and made
> meaning-ful by the life cycle of the entity.
>
> Well, if we separate communication from the phenomenon of life, from its
> intertwining with the life cycle of the entity, then everything goes...
> and yes, quarks communicate, as well as billiard balls, stones, cells,
> etc. In that way we provide a further anchor for the mechanistic way of
> thinking.
>
> best regards--Pedro
>
>
>
> Koichiro Matsuno escribió:
>>
>> At 2:43 AM 01/19/2016, Jerry wrote:
>>
>> In order for symbolic chemical communication to occur, the language
>> must go far beyond such simplistic notions of a primary interaction
>> among forces, such as centripetal orbits or even the four basic forces.
>>
>> The quark physicist is quirky in confining a set of quarks,
>> including possibly tetra- or even penta-, within a closed bag with use
>> of a virtual exchange of matter called gluons. This bag is
>> methodologically tightly-cohesive because of the virtuality of the
>> things to be exchanged exclusively in a closed manner. In contrast,
>> the real exchange of matter underlying the actual instantiation of
>> cohesion, which concerns the information phenomenologist facing
>> chemistry and biology in a serious manner, is about something
>> referring to something else in the actual and is thus open-ended.
>> Jerry, you seem to be calling our attention to the actual cohesion acting in
>> the empirical world which the physicist has failed in coping with, so
>> far.
>>
>>Koichiro
>>
>> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Jerry
>> LR Chandler
>> Sent: Tuesday, January 19, 2016 2:43 AM
>> To: fis
>> Subject: [Fis] _ Re: Cho 2016 The social life of quarks
>>
>> Koichiro, Bob U., Pedro:
>>
>> Recent posts here illustrate the fundamental discord between modes of
>> human communication.  Pedro’s last post neatly addresses the immediate
>> issue.
>>
>>  But, the basic issue goes far, far deeper.
>>
>> The challenge of communicating our meanings is not restricted to just
>> scientific meaning vs. historical meaning.  Nor to communication between
>> the general community and, say, the music (operatic and ballad)
>> communities.
>>
>> Nor, is it merely a matter of definition of terms and re-defining
>> terms as “metaphor” in another discipline.
>>
>> Pedro’s post aims toward the deeper issues, issues that are fairly well
>> known and understood in the symbolic logic and chemical communities.
>>  In the chemical community, the understanding is at the level of
>> intuition because ordinary usage within the discipline requires an
>> intuitive understanding of the way symbolic usage manifests itself in
>> different disciplines.
>>
>> (For a detailed description of these issues, see, The Primary Logic,
>> Instruments for a dialogue between the two Cultures. M. Malatesta,
>> Gracewings, Fowler Wright Books, 1997.)
>>
>> The Polish logician A. Tarski recognized that the separation of meanings
>> and definitions requires the use of METALANGUAGES.  For example,
>> ordinary public language is necessary for expression of meaning of
>> mathematical symbolic logic.  But, from the basic mathematical
>> language, once it grounded 

Re: [Fis] January Lecture--Information and the Forces of History

2016-01-06 Thread Robert E. Ulanowicz
Dear Howard and Pedro,

Please allow me to comment on the complementary visions of the Lucifer vs.
the "angelic" scenarios.

Anyone familiar with my work knows that I see configurations of mutually
beneficial processes as the driver behind all of evolution. The problem is
that this dynamic is, like much of nature, normatively ambiguous.

Mutually beneficial configurations exhibit a "centripetality", or the
tendency to bring ever more resources into their own orbits. This is a
universal and necessary, though much neglected, attribute of all living
systems.

So one is able to view this dynamic in either its angelic or Luciferian
manifestations:

At the angelic extreme, Giovanni di Fidenza (a.k.a., Bonaventure) saw the
infinite love shared among the persons of the Trinity as the beginning of
all creation, drawing all of creation eventually towards Godself.

The growth-inducing aspect of mutual beneficence was implicit in Darwin's
description of the counterplay between growth and elimination. The growth
side of the interaction has subsequently been minimized, and current
evolutionary theory emphasizes elimination. 

Of course, induced growth situated within a finite context eventually
leads to competition and often to elimination -- the Luciferian side of
the same phenomenon. Centripetality also shades into the manifestation of
"self", or selfishness. (On the human scale, Daryl Domning speaks, not of
Original Sin, but "Original Selfishness". :) Induced competition and
selfishness then combine to yield Howard's pecking order.

And so the drama of the universe unfolds as a struggle between rampant
selfishness and kenotic beneficence -- from the atomic scale all the way
to universal dimensions. It all begins with mutual beneficence, but it
evolves/devolves into complex interplay of phenomena to which we assign
contrasting normative values.

Peace to all,
Bob U.

> The Force of History--Howard Bloom
>
>
> In 1995, I published my first book, The Lucifer Principle: A Scientific
> Expedition Into the Forces of History.  It sold roughly 140,000 copies
> worldwide and is still selling.  Some people call it their Bible.  Others
> say that it was the book that predicted 9/11.  And less than two months
> ago, on November 13, 2015, some current readers said it was the book that
> explained ISIS’ attacks on Paris.  Why?  What are the forces of history?
> And what do they have to do with information science?
> The Lucifer Principle uses evolutionary biology, group selection,
> neurobiology, immunology, microbiology, computer science, animal behavior,
> and anthropology to probe mass passions, the passions that have powered
> historical movements from the unification of China in 221 BC and the start
> of the Roman Empire in 201 BC to the rise of the Empire of Islam in 634 AD
> and that empire’s modern manifestations, the Islamic Revolutionary
> Republic of Iran and ISIS, the Islamic State, a group intent on
> establishing a global caliphate.  The Lucifer Principle concludes that the
> passions that swirl, swizzle, and twirl history’s currents are a secular
> trinity.  What are that trinity’s three components?  The superorganism,
> the pecking order, and ideas.
> What’s a superorganism?  Your body is an organism.  But it’s also a
> massive social gathering.  It’s composed of a hundred trillion cells.
> Each of those cells is capable of living on its own.  Yet your body
> survives thanks to the existence of a collective identity—a you.  In 1911,
> Harvard biologist William Morton Wheeler noticed that ant colonies pull
> off the same trick.  From 20,000 to 36 million ants work together to
> create an emergent property, a collective identity, the identity of a
> community, a society, a colony, or a supercolony.  Wheeler observed how
> the colony behaved as if it were a single organism.  He called the result
> a “superorganism.”
> Meanwhile in roughly 1900, when he was still a child, Norway’s Thorleif
> Schjelderup-Ebbe got into a strange habit: counting the number of pecks
> the chickens in his family’s flock landed on each other and who pecked
> whom.  By the time he was ready to write his PhD dissertation in 1918,
> Ebbe had close to 20 years of data.  And that data demonstrated something
> strange.  Chickens in a barnyard are not egalitarian.  They have a strict
> hierarchy.  At the top is a chicken who gets special privileges.  All
> others step aside when she goes to the trough.  She is the first to eat.
> And she can peck any other chicken in the group.  Then comes chicken 

Re: [Fis] Sustainability through multilevel research: The Lifel, Deep Society Build-A-Thon - 1

2015-12-11 Thread Robert E. Ulanowicz
Dear Nikhil,

As regards ecosystems, some 20% or so of bound energy is retained via
cycling, but the primary function of cycling as conservator lies with
limiting elements. Often 70+% of necessary elements are retained via
cycling within the system. This becomes very evident when one regards
coral reefs, where nutrients within the reef are far richer than in the
desert ocean that surrounds them.

The best,
Bob

> At level 1 (molecular self-organization), solar energy is stored in
> high-energy reduced molecules. Do you see a possibility that living
> systems could store energy in cycles involving less stable species at the
> two other levels (levels 2 and 3) as well? (When I speak of stored energy,
> I am referring to stored energy as introduced by McClare, and discussed by
> Ulanowicz and Ho [Sustainable Systems as Organisms?, BioSystems 82 (2005)
> 39–51].)




Re: [Fis] Sustainability through multilevel research

2015-11-23 Thread Robert E. Ulanowicz
Dear Pedro & Nikhil:

Thank you for taking us once more into the realm of economics and ecology!

Since 2003, perhaps the most significant new insight to me has been the
discovery that ecological systems do not (and cannot) run at maximal
efficiency. A degree of efficiency is required for resource utilization,
but too much can lead to system collapse -- the system becomes too
dependent on the few most efficient pathways and loses alternative, less
efficient routes that can replace the major players when they are impacted
by novel disturbance. Systems that endure have achieved a balance between
efficiency and reliability.

This pertains to IT, because the Bayesian decomposition of the Shannon
diversity allows one to track the complementary attributes of network
efficiency and reliability. Data are still insufficient but it appears
that ecosystems favor reliability over efficiency. See, for example, Fig.7
on p1890 of <http://people.clas.ufl.edu/ulan/files/Dual.pdf>, where most
systems cluster around 40% efficient order and 60% residual diversity that
can function as "strength in reserve".

In applying the IT calculus, it is important to keep in mind that
information theory is predicated upon quantifying something that is
*missing* (the Shannon diversity) as prelude to quantifying that which
exists (the mutual information). Conventional science very rarely
addresses that which is missing, but with IT it can be quantified
<http://people.clas.ufl.edu/ulan/files/FISPAP.pdf> and manipulated as a
part of ecological management. (See pp. 50-51 in
<http://people.clas.ufl.edu/ulan/files/Methods2.pdf>.)
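
To make the calculus concrete, here is a minimal sketch in Python (my own
toy example, natural-log units; the flow values are invented, and the linked
papers give the full treatment). Capacity C is the Shannon diversity of the
flows, ascendency A is the flow-scaled mutual information that quantifies
efficient order, and the residual C - A is the "strength in reserve":

    import numpy as np

    def network_indices(T):
        # T[i][j] = flow from compartment i to compartment j
        T = np.asarray(T, dtype=float)
        tot = T.sum()               # total system throughput
        out = T.sum(axis=1)         # outputs of each compartment
        inp = T.sum(axis=0)         # inputs to each compartment
        i, j = np.nonzero(T)        # skip zero flows (0 log 0 = 0)
        f = T[i, j]
        C = -(f * np.log(f / tot)).sum()                     # capacity
        A = (f * np.log(f * tot / (out[i] * inp[j]))).sum()  # ascendency
        return C, A, C - A

    T = [[0, 8, 2],                 # hypothetical 3-compartment flows
         [1, 0, 6],
         [3, 1, 0]]
    C, A, Phi = network_indices(T)
    print(f"efficient order a = {A/C:.2f}, reserve = {Phi/C:.2f}")

This particular toy network happens to print a = 0.37 -- near the empirical
cluster just mentioned.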

The possible economic implications of these observations are enormous.
Classical economics is centered around making the economic system as
efficient as possible. Market efficiency is the sine qua non of economics,
but the likelihood is that going too far down that road leads to collapse
-- witness the boom-bust cycles of the capital system. Might there not be
some balance in economic systems akin to that which nature seems to be
revealing?

Bernard Lietaer has been promoting the use of alternative currencies for a
long while now, but was receiving heavy criticism from his fellow
economists, who countered that multiple currencies degrade market
efficiency. But perhaps that is exactly what is needed to make our
economic system less volatile?

A few publications giving more details of this argument include:

<http://people.clas.ufl.edu/ulan/files/ECOCOMP2.pdf>

<http://people.clas.ufl.edu/ulan/files/Goerner.pdf>

<http://people.clas.ufl.edu/ulan/files/Lietaer.pdf>

<http://people.clas.ufl.edu/ulan/files/Crisis.pdf>

These notions have already attracted the attention of the French Ministry
of Economics, and our colleague, Bernard Lietaer, has been busy extolling
the connection all across the European academy.

Peace to all,
Bob Ulanowicz

---------
Robert E. Ulanowicz           |  Tel: +1-352-378-7355
Arthur R. Marshall Laboratory  |  FAX: +1-352-392-3704
Department of Biology  |  Emeritus, Chesapeake Biological Lab
Bartram Hall 110   |  University of Maryland
University of Florida  |  Email <u...@umces.edu>
Gainesville, FL 32611-8525 USA |  Web <http://www.cbl.umces.edu/~ulan>
--


> Dear Nikhil and FIS Colleagues,
>
> Thanks for the thought-provoking opening. Actually even a superficial
> reading of all the stuff you have recommended us becomes quite a bit of
> hard work--but it pays. For my taste, the paper on "Part Three" contains
> the most essential new thinking.  Perhaps the excessive reliance on
> systems-systems parlance is not convenient, both from a rhetorical and a
> conceptual point of view. It gives the impression of a reductionist
> Procrustean Bed where all the (endless!) stuff not amenable to the
> ongoing treatment becomes eliminated or treated as nonexistent. But it
> is a matter, maybe, just of style, that can be conveniently reformulated.
> Notwithstanding these trifling comments about form, the contents are
> significant and timely (at least for my personal taste!).
>
> About contents, again I will start with a few mild criticisms of the
> particular scheme proposed, where mycorrhiza and gut bacteria appear as
> central modulators. I can be wrong, of course, as I think you have taken
> new arenas of research (still unsettled) and somehow pushed them to an
> argumentative extreme, not very reliable actually. But the discussion in
> Section 3.5 about new avenues for aligning ecosystems and economic
> systems is full of valuable insights. I get along with it (with
> secondary nuances) and so it allows me to respond explicitly to your
> question 3 below, quite positively. Co

Re: [Fis] Shannon-Weavers' Levels A, B, C.

2015-10-14 Thread Robert E. Ulanowicz

> On 2015-10-14, at 12:38 PM, Marcus Abundis wrote:
>
>> RE Mark Johnson's post of Thu Oct 1 09:47:13 on Bateson and imagination
Two quick remarks:

1. It's not at all clear to me that C is subsumptive of B.

2. I would lobby for Shannon/Bayesian relationships as an intermediary
between A and B (i.e., preliminary to "meaning").

Cheers to all,
Bob U.

>> . . .
>>  – Me Too!
>>
>> RE Loet & Stan's postings beginning Thu Oct 1 21:19:50 . . .
>> >  I suggest to distinguish between three levels (following Weaver): <
>> > A. (Shannon-type) information processing ; <
>> > B. meaning sharing using languages;<
>> > C. translations among coded communications.<
>> > So, here we have a subsumptive hierarchy"<
>>
>> I was wondering if this note means to imply an *all inclusive* list of
>> traits to be considered in modeling information? Or, alternatively . . .
>> what would such an all inclusive list look like?
>>
>> Thanks!
>>
>>
>>
>> Marcus Abundis
>> about.me/marcus.abundis




Re: [Fis] [Fwd: Re: Information is a linguistic description of structures]--T...

2015-09-29 Thread Robert E. Ulanowicz
Howard:

I applaud your critique of our legacy attempts to render life meaningful
in terms of what you call "necrophilia" and Hans Jonas has called an
"ontology of death".

In my last book, "A Third Window", I attempted to develop the metaphysics
of a process ecology of relationships as an alternative starting point.


I especially resonated with your mention of the failure of conventional
and relativistic physics to explain the spiral arms of some galaxies. This
I believe is due to the constraints of the continuum assumption laid down
by Euler and Leibniz, which conflates cause with effect. One can get away
with this assumption so long as the interval between cause and effect is
virtually immediate. In a galaxy 100,000 light years in diameter, however,
this assumption begins to fray. It likely breaks down altogether across
intergalactic distances.

The continuum assumption leads to symmetrical laws of nature, and as
Noether taught us, symmetry and conservation are joined at the hip. Is it
any wonder, then, that inconsistencies leading to the postulation of
"dark" matter and energy should arise if one uses only symmetrical laws?

What is known to few is that Newton (who ironically gets a lot of the
blame for the Eulerian assumption) inveighed strongly against equating
cause with effect. The historian of science Ed Dellian gives the full story
on his website. I offer some consequences in my talk at "Seizing an
Alternative", which took place back in June.


Having thus waxed ebullient over your insights, I nonetheless tend to
agree with Terry that discussion on communication or information should
not be confined to language or genomics. In fact, I would contend that
information should not be limited to association with communication. As
Stan Salthe contends, it is more generally tied to any form of constraint.
John Collier, for example, identifies such information as inheres in
structures as "enformation", and this form is readily quantifiable using
the information calculus of Shannon.
Such reckoning permits us to develop an alternative phenomenology to the
"dead objects moving according to universal laws" approach to apprehending
life.


Prodded by Jonas, we need to give intensive effort to articulating an
"ontology of life".

Peace,
Bob U.

>
> re: it is likely to be problematic to use language as the paradigm model
> for all communication--Terrence Deacon
>
> Terry  makes interesting points, but I think on this one, he may be
> wrong.
> Guenther Witzany is on to something.  our previous  approaches  to
> information have been what Barbara Ehrenreich, in her  introduction to the
> upcoming
> paperback of my book The God Problem: How a Godless  Cosmos Creates, calls
> "a kind of unacknowledged necrophilia."
>
> we've been using dead things to understand living things.  aristotle put
> us on that path when he told us that if we could break things down to
> their "elements" and understand what he called the "laws" of those
> elements, we'd understand everything.  Newton took us farther down that
> path when he said we could understand everything using the metaphor of
> the "contrivance," the machine--the metaphor of "mechanics" and of
> "mechanism."
>
> Aristotle and Newton were wrong.  Their ideas have had centuries to pan
> out, and they've led to astonishing insights, but they've left us blind
> to the relational aspect of things. utterly blind.
>
> the most amazing metaphor of relationality available to us is not math,
> it's not mechanism, and it's not reduction to "elements," it's language.
> by using the metaphor of a form of language called "code," watson and
> crick were able to understand what a strand of dna does and how.  without
> language as metaphor, we'd still be in the dark about the genome.
>
> i'm convinced that by learning the relational secrets of the body of work
> of a Shakespeare or a Goethe we could crack some of the secrets we've been
> utterly unable to comprehend, from what makes the social clots we call a
> galaxy's spiral arms (a phenomenon that astronomer Greg Matloff, a Fellow
> of the British Interplanetary Society, says defies the laws of Newtonian
> and Einsteinian physics) to what makes the difference between life and
> death.
>
> in other words, it's time we confess in science just how little we know
> about language, that we explore language's mysteries, and that we use our
> discoveries as a crowbar to pry open the secrets of this highly
> contextual,
> deeply relational, profoundly communicational cosmos.
>
> with thanks for tolerating my opinions.
>
> howard
>
> 
> Howard Bloom



Re: [Fis] Information and Locality Introduction

2015-09-11 Thread Robert E. Ulanowicz
I'll have to weigh in with Stan on this one. Stan earlier had defined
information more generally as "constraint". It is convenient to employ the
IT calculus to separate constraint from indeterminacy. This is possible in
complete abstraction from anything to do with communication.

The ability to make this separation has wide-ranging consequences. For
example, it provides a pathway by which process philosophy can be brought
to bear on quantitative physical systems! It is no longer necessary to
rely solely on positivist "objects moving according to law". That's no
small advance!
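
(A toy number to fix the idea -- my own, with invented counts: if a system
could occupy 2^10 equiprobable configurations and its constraints confine
it to 2^3 of them, then the constraint accounts for log2(2^10 / 2^3) = 7
bits and the remaining indeterminacy for 3 bits. The IT calculus separates
exactly these two pieces.)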



The best,
Bob

> Pedro wrote"
>
>>Most attempts to enlarge informational thought and to extend it to life,
> economies, societies, etc. continue to be but a reformulation of the
> former
> ideas with little added value.
>
> S: Well, I have generalized the Shannon concept of information carrying
> capacity under 'variety'...  {variety {information carrying capacity}}.
> This allows the concept to operate quite generally in evolutionary and
> ecological discourses.  Information, then, if you like, is what is left
> after a reduction in variety, or after some system choice.  Consider
> dance:
> we have all the possible conformations of the human body, out of which a
> few are selected to provide information about the meaning of a dance.
>
> STAN
>
> On Fri, Sep 11, 2015 at 8:22 AM, Pedro C. Marijuan <
> pcmarijuan.i...@aragon.es> wrote:
>
>> Dear Steven and FIS colleagues,
>>
>> Many thanks for this opening text. What you are proposing about a pretty
>> structured discussion looks a good idea, although it will have to
>> confront the usually anarchic discussion style of FIS list! Two aspects
>> of your initial text have caught my attention (apart from those videos
>> you recommend that I will watch along the weekend).
>>
>> First about the concerns of a generation earlier (Shannon, Turing...)
>> situating information in the intersection between physical science and
>> engineering. The towering influence of this line of thought, both with
>> positive and negative overtones, cannot be overestimated. Most attempts
>> to enlarge informational thought and to extend it to life, economies,
>> societies, etc. continue to be but a reformulation of the former ideas
>> with little added value. See one of the last creatures: "Why Information
>> Grows: The Evolution of Order, from Atoms to Economies" (2015), by Cesar
>> Hidalgo (prof. at MIT).
>>
>> In my opinion, the extension of those classic ideas to life are very
>> fertile from the technological point of view, from the "theory of
>> molecular machines" for DNA-RNA-protein matching to genomic-proteomic
>> and other omics'  "big data". But all that technobrilliance does not
>> open per se new avenues in order to produce innovative thought about the
>> information stuff of human societies. Alternatively we may think that
>> the accelerated digitalization of our world and the cyborg-symbiosis of
>> human information and computer information do not demand much brain
>> teasing, as it is a matter that social evolution is superseding by
>> itself.
>>
>> The point I have occasionally raised in this list is whether all the new
>> molecular knowledge about life might teach us about a fundamental
>> difference in the "way of being in the world" between life and inert
>> matter (& mechanism & computation)---or not. In the recent compilation
>> by Plamen and colleagues from the former INBIOSA initiative,  I have
>> argued about that fundamental difference in the intertwining of
>> communication/self-production, how signaling is strictly caught in the
>> advancement of a life cycle  (see paper "How the living is in the
>> world"). Life is based on an inusitate informational formula unknown in
>> inert matter. And the very organization of life provides an original
>> starting point to think anew about information --of course, not the only
>> one.
>>
>> So, to conclude this "tangent", I find quite exciting the discussion we
>> are starting now, say from the classical info positions onwards, in
>> particular to be compared in some future with another session (in
>> preparation) with similar ambition but starting from say the
>> phenomenology of the living. Struggling for a
>> convergence/complementarity of outcomes would be a cavalier effort.
>>
>> All the best--Pedro
>>
>>
>>
>> Steven Ericsson-Zenith wrote:
>>
>>> ...The subject is one that has concerned me ever since I completed my
>>> PhD
>>> in 1992. I came away from defending my thesis, essentially on large
>>> scale
>>> parallel computation, with the strong intuition that I had disclosed
>>> much
>>> more concerning the little that we know, than I had offered either a
>>> theoretical or engineering solution.
>>> For the curious, a digital copy of this thesis can be found among the
>>> reports of CRI, MINES ParisTech, formerly ENSMP,
>>> 

Re: [Fis] Answer to Mark. Phenomenology and Speculative Realism

2015-08-01 Thread Robert E. Ulanowicz
Dear Joseph et al.,

I'm afraid I can't comment on the adequacy of Husserlian phenomenology, as I
never could get very far into Husserl. I would just add that there is
also a variety of phenomenology associated with thermodynamics and
engineering.

The generic meaning of phenomenology is the study of phenomena in
abstraction of their eliciting causes. This applies to almost all of
classical thermodynamics and much of engineering. The idea is to describe
the behavior of systems in quantitative fashion. If the resulting
mathematical description proves reliable, it becomes a phenomenological
description. PV=nRT is such a description. Too often physicists try to
identify thermodynamics with statistical mechanics, an action that is
vigorously eschewed by engineers, who claim the field as their own.

I have spent most of my career with the phenomenology of quantified
networks, where phenomena such as intersubjectivity (if I correctly
understand what is meant by the term) thoroughly pervade events.

Of course, I'm feathering my own nest when I say that I believe that the
only *current* fruitful way to approach systems biology is via such
phenomenology! (See Section 3 in
http://people.clas.ufl.edu/ulan/files/Reckon.pdf.)

The best,
Bob

 Dear Mark,

 Thank you for this note, which points correctly to the fact that there was
 something missing in the debate. Intersubjectivity is a good word for it,
 but phenomenology in general is probably no longer the answer, if it ever
 was. Check out the new book by Tom Sparrow, The End of Phenomenology,
 Edinburgh, 2014; Sparrow is a key player in a new 'movement' called
 Speculative Realism which is proposed as a replacement.

 What does this have to do with information? I think a great deal and worth
 a new debate, even in extremis. The problem with Husserlian phenomenology
 is that it fails to deliver an adequate picture of reality, but
 speculative realism is too anti-scientific to do any better. What I think
 is possible, however, is to reconcile the key insights of Heidegger with
 science, especially, with information science. This places information
 science in a proper intersubjective context where its utility can be seen.
 For discussion, I hope.

 Best,

 Joseph




Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread Robert E. Ulanowicz
Folks

I know there is a long legacy of equating information with entropy, and
dimensionally, they are the same. Qualitatively, however, they are
antithetical. From the point of view of statistical mechanics, information
is a *decrease* in entropy, i.e., they are negatives of each other.

This all devolves back upon the requirement that *both* entropy and
information require a reference state. (The third law of thermodynamics.)
Once a reference distribution has been identified, one can then quantify
both entropy and information. It actually turns out that against any
reference state, entropy can be parsed into two components, mutual
information and conditional (or residual) entropy. Change the reference
state and the decomposition changes.
http://people.clas.ufl.edu/ulan/files/FISPAP.pdf (See also Chapter 5 in
http://people.clas.ufl.edu/ulan/publications/ecosystems/gand/.)
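
A minimal numeric sketch of that parsing (my own toy joint distribution, in
Python, with bits as units; the product of the marginals serves as the
reference distribution here):

    import numpy as np

    def decompose(joint):
        # parse the total entropy into mutual information plus the
        # residual (conditional) entropy, measured against the
        # independent reference p(x)p(y)
        p = np.asarray(joint, dtype=float)
        p /= p.sum()
        px, py = p.sum(axis=1), p.sum(axis=0)
        ref = np.outer(px, py)      # the reference distribution
        nz = p > 0                  # 0 log 0 = 0
        H = -(p[nz] * np.log2(p[nz])).sum()           # total entropy
        I = (p[nz] * np.log2(p[nz] / ref[nz])).sum()  # mutual information
        return H, I, H - I

    H, I, R = decompose([[0.4, 0.1],
                         [0.1, 0.4]])
    print(f"H = {H:.2f} bits = I ({I:.2f}) + residual ({R:.2f})")

Substituting a different reference distribution for ref changes the split,
which is the point made above.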

Cheers to all,
Bob

 Folks,

 Doing dimensional analysis, entropy is heat difference divided by
 temperature. Heat is energy, and temperature is energy per degree of
 freedom. Dividing, we get units of inverse degrees of freedom. I submit
 that information has the same fundamental measure (this is a consequence
 of Scott Muller’s asymmetry principle of information). So fundamentally we
 are talking about the same basic thing with information and entropy.

 I agree, though, that it is viewed from different perspectives and they
 have differing conventions for measurement.

 I agree with Loet’s other points.

 John




Re: [Fis] Information Foundation of the Act--F.Flores & L.deMarcos

2015-07-26 Thread Robert E. Ulanowicz
Dear Dr. Marcos-Ortega:

Are you aware of the algorithm to remove cycles from a weighted digraph?

http://people.clas.ufl.edu/ulan/files/Cyclng83.pdf

It is available as a DOS routine
http://www.cbl.umces.edu/~ulan/ntwk/network.html (see NETWRK4.2) or in R
for Windows https://cran.r-project.org/web/packages/enaR/enaR.pdf (see
enaCycle).
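
For readers without those packages, a minimal sketch of the underlying idea
is easy to write in Python with networkx (my own illustration -- not the
published NETWRK/enaCycle routine, which apportions flow among whole nexuses
of cycles; the arc weights below are invented):

    import networkx as nx

    def strip_cycles(flows):
        # repeatedly find a simple cycle, subtract the smallest flow on
        # that loop from each of its arcs, and drop arcs that reach zero
        G = nx.DiGraph()
        for (u, v), w in flows.items():
            G.add_edge(u, v, flow=w)
        while True:
            try:
                cycle = next(nx.simple_cycles(G))
            except StopIteration:
                break               # the digraph is now acyclic
            arcs = list(zip(cycle, cycle[1:] + cycle[:1]))
            smallest = min(G[u][v]["flow"] for u, v in arcs)
            for u, v in arcs:
                G[u][v]["flow"] -= smallest
                if G[u][v]["flow"] <= 0:
                    G.remove_edge(u, v)
        return {(u, v): d["flow"] for u, v, d in G.edges(data=True)}

    # the 3-arc loop below loses its weakest link:
    print(strip_cycles({("A", "B"): 5, ("B", "C"): 4, ("C", "A"): 3}))
    # leaves {('A', 'B'): 2, ('B', 'C'): 1}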

Sincerely,
R. Ulanowicz


P.S. Actually, in ecology, we discovered that cycles indicate the
domains of control.

 Hi Jerry,

 Thank you for your comments and questions. Also apologies for late
 response but I was on a trip.

 As for your question about cycles, actions can certainly be modeled as
 graphs that include cycles. We restricted our characterization to trees
 because:
 a) cycles can imply infinite loops that in our opinion are not appropriate
 to model human actions
 b) even considering cycles, a set of actions can still be modeled as a
 tree, so we consider that loops add unnecessary complexity to the model

 Furthermore, we understand the products of cyclic social processes like
 methods and procedures as technologies that embed information. All in all,
 we agree that a characterization of graphs can be useful to quantify
 information embedded in artifacts.

 Regards,

 Luis de Marcos Ortega
 Dpto Ciencias de la Computación / Computer Science Department
 Universidad de Alcalá / University of Alcalá
 http://www.uah.es/pdi/luis_demarcos

 Education, n. That which discloses to the wise and disguises from the
 foolish their lack of understanding. Ambrose G. Bierce.

 From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Jerry LR
 Chandler
 Sent: Thursday, 23 July 2015 17:02
 To: FIS Science fis@listas.unizar.es
 Subject: Re: [Fis] Information Foundation of the Act--F.Flores &
 L.deMarcos


 List:

 This comment is restricted to the proposed use of mathematical structures
 in context of the social.

 The mathematical structure of a tree is restricted by the notion of a
 cycle.
 A tree is readily converted into a cycle by simply adding a new edge
 between leaves or joints.
 The simple logic of a tree is lost by including cyclic relations.

 It appears to me that the rhetorical arguments may include inferences
 requiring cyclic relations.

 What would be the nature of the inferences if the hypotheses allowed for
 cyclic social processes, such as learning on the basis of annual
 agricultural or hunting cycles?

 Cheers

 Jerry








Re: [Fis] It From Bit video

2015-05-26 Thread Robert E. Ulanowicz
I would like to strongly reinforce John's comments about boundary
conditions. We tend to obsess over the laws and ignore the boundary
statements. (Sort of a shell game, IMHO.) If boundary conditions cannot be
stated in closed form, the physical problem remains indeterminate! (The
aphorism from computer science, "Garbage in, garbage out!", is appropriate
to reversible laws as well.)

Then there is the issue of the continuum assumption, which was the work of
Euler and Leibniz, not Newton. Newton argued vociferously against it,
because it equated cause with effect. The assumption works quite well,
however, whenever cause and effect are almost simultaneous, as with a
force impacting an object, where the force is transmitted over small
distances at the speed of light. It doesn't work as well when large
velocities are at play (relativity) or very small distances and times
(quantum phenomena) -- whence the need arose to develop the exceptional
sciences, thermodynamics, relativity and quantum physics.

I would suggest it doesn't work well at very large distances, either.
Consider galaxies, which are on the order of 100,000 or more light years
in diameter. (I was surprised to learn recently that we really don't have
decent models for the dynamics of galaxies.) Gravitational effects are
relatively slow to traverse those distances, so that cause and effect are
not immediate. (Sorry, I don't think quantum entanglement is going to
solve this conundrum.) If cause and effect are widely separated, then the
continuum assumption becomes questionable and by implication,
reversibility as well. Now Noether demonstrated that reversibility and
conservation are two sides of the same coin. So I see it as no great
mystery that we encounter problems with conservation of matter and energy
at galactic scales or higher -- witness dark matter and dark energy.

Of course, I am neither a particle physicist nor an astrophysicist, but
merely someone writing from my armchair. So I invite anyone on FIS to put
me straight as regards my speculations on these issues.

Cheers,
Bob U.


 Interesting question, Ken. I was not overly impressed with the video
 because it didn’t explain one of the most crucial points about the use
 of information in dealing with quantum gravity, for which we as yet have
 no good theory. The issue with both black holes and the origin of the
 universe process is that the boundary conditions are dynamical. You can
 have as many laws as you could want and still not have a physics if the
 boundary conditions are ignored. Usually they are added in as an initial
 state, or sometimes ad hoc but when they are changing, especially if they
 are mathematically inseparable from the laws, there is a problem with
 relying on the laws alone to explain. With black holes there is a question
 of whether or not information disappears at their event horizon. There is
 a similar issue for the observable portion of the universe at any given
 time. It is hard to see how the questions can even be posed without
 referring to information. Any boundary in basic physics can be conceived
 the same way, and if all masses and energies come from geometry (in a
 Unified Theory) then information is all there is in basic physics.

 I have argued for some time now that biological systems are much more
 defined by their boundary conditions, which are typically dynamical and
 changing, than by their energy flows, so information flows dominate,
 though energy flows place limits, so I have talked of the information and
 energy budgets being partially decoupled in biological systems. So
 information is important to biology because understanding its flow can
 answer questions about dynamical boundaries, just like in basic physics.
 The energy (and matter) flows I will leave to the biophysicists, but the
 paragraph above suggests that these are information flows as well. I like
 the potential for unification here.

 Cheers,
 John

 From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Ken Herold
 Sent: May 26, 2015 12:30 AM
 To: fis
 Subject: [Fis] It From Bit video

 Released recently--what about the biological?

 https://www.youtube.com/watch?v=-ATWa2AEvIY

 --
 Ken





Re: [Fis] What are information and science?

2015-05-20 Thread Robert E. Ulanowicz
Dear Dai:

To say that molecules only interact directly is to ignore the metabolic
matrix that constitutes the actual agency in living systems. For example,
we read everywhere how DNA/RNA directs development, when the molecule
itself is a passive material cause. It is the network of proteomic and
enzymatic reactions that actually reads, interprets and edits the
primitive genome. Furthermore, the structure of that reaction complex
possesses measurable information (and complementary flexibility).

Life is not just molecules banging into one another. That's a physicist's
(Lucretian) view of the world born of models that are rarefied,
homogeneous and (at most) weakly interacting. (Popper calls them "vacuum
systems".) The irony is that that's not how the cosmos came at us! Vacuum
systems never appeared until way late in the evolution of the cosmos. So
the Lucretian perspective is one of the worst ways to try to make sense
of life. We need to develop a perspective that parallels cosmic evolution,
not points in the opposite direction. To do so requires that we shift from
objects moving according to universal laws to processes giving rise to
other processes (and structures along the way).

The contrast is most vividly illustrated in reference to the origin of
life. Conventional metaphysics requires us to focus on molecules, whereby
the *belief* is that at some point the molecules will miraculously jump up
and start to live (like the vision of the Hebrew prophet Ezekiel). A
process-oriented scenario would consist of a spatially large cycle of
complementary processes (e.g., oxidation and reduction) that constitutes a
thermodynamic work cycle. Those processes then can give rise to and
support smaller cycles, which eventually develop into something resembling
metabolic systems. A far more consistent progression!

Of course, this view is considered catastrophically heterodox, so please
don't repeat it if you don't already have tenure. ;-)

Peace,
Bob U.

  I see two distinct cases:

 Case 1: For molecules 'communication' consists of interaction between
 the molecules themselves, resulting in biology.
 Similarly, for atoms 'communication' consists of interaction between the
 atoms themselves. They bang into each other and exchange their components.

 Case 2: For words and sentences (in my view of the world) it is human
 beings who communicate, not words and sentences. From a Maturana
 perspective, language is a recursive coordination between autopoietic
 entities, not interaction between linguistic items.

 In case 1, there is no mediating domain. Molecules and atoms interact
 directly.

 But in case 2, there is a hierarchy. Communication is between human
 beings, but interaction is through words and sentences in a linguistic
 domain. When I respond to your email, I do not have an effect on that
 email. Rather, I hope to have an effect on your thought processes.




Re: [Fis] Information-as-Process

2014-12-12 Thread Robert E. Ulanowicz
Dear Ken & Pedro,

Unfortunately, I have not read Friston's thesis. In his abstract he
writes, Furthermore, if we look closely at what is optimized, the same
quantity keeps emerging, namely value (expected reward, expected utility)
or its complement, surprise (prediction error, expected cost). This is the
quantity that is optimized under the free-energy principle...

When one is dealing with complementary values, optimization usually infers
some balance. Is this what Friston means? If so, that is quite similar to
the balance between mutually-exclusive attributes that we observe when we
apply Shannon-like calculus to ecosystem trophic exchange networks. (See
Fig 7 on p1890 in http://people.biology.ufl.edu/ulan/pubs/Dual.pdf.)
Very interesting!

Bob U.

 Nature Reviews Neuroscience 11, 127-138 (February 2010) | doi:10.1038/nrn2787
 http://www.nature.com/nrn/journal/v11/n2/full/nrn2787.html

 :)  Ken

 On Fri, Dec 12, 2014 at 8:19 AM, Pedro C. Marijuan 
 pcmarijuan.i...@aragon.es wrote:

 Dear Loet, Steven, and colleagues,

 During last ten years or so, with particular success in most recent
 years,
 Karl Friston has developed his free energy optimization principle, based
 on
 Shannon's information theory and optimal control theory as well as on
 the
 Bayesian brain hypothesis. I think this is the most advanced work
 towards a
 unified brain theory today. The minimization dynamics of the cerebral
 free
 energy construct (it is a sort of Helmholtz program revisited) becomes a
 generative process of perception, action, learning and adaptive
 behaviors
 in general. The 2010 paper (Nature Reviews Neuroscience, doi:
 10.1038/nrn2787), where he precisely argues for a unified brain theory,
 is
 quite representative of his proposals. On a personal basis, during last
 two
 decades I was following and cooperating with Kenneth Paul Collins (we
 published a book in Spanish about the emergence of behavior from brain
 dynamics). Our scheme was based on the minimization of a collective
 variable supposedly a sort of entropy of excitation/inhibition ratios
 topologically distributed among neuronal surfaces of the cortex that was
 performed essentially by the medial parts of the brain. Although very
 rich
 in qualitative and behavioral aspects, the formal part was too weak
 (awfully weak). Until recent years I could not connect meaningfully
 Collins's approach with other works, and unfortunately he left scientific
 research long ago--but now the marriage with Friston's is remarkable.
 Putting them together may be a very fertile exploratory avenue.

 best ---Pedro

 Loet Leydesdorff wrote:


 Dear Steven and colleagues,


 I did not (yet) study your approach. Is there a paper that can be read
 as
 an introduction?


 It seems to me that one can distinguish between formal and substantial
 theories of information. Shannon’s mathematical theory is a formal
 apparatus: the design and the results do not yet have meaning without
 an
 interpretation in a substantial context. On the other side, a theory
 about,
 for example, neuro-information is a special theory. One can in this
 context
 use information theory as a statistical tool (among other tools).
 Sometimes, one can move beyond description. :-)


 The advantage of information theory, from this perspective of special
 theories, is that the formal apparatus allows us sometimes to move
 between
 domains heuristically. For example, a model of the brain can perhaps be
 used metaphorically for culture or the economy (or vice versa). The
 advantages have to be shown in empirical research: which questions can
 be
 addressed and which puzzles be solved?


 Best,

 Loet


 

 Loet Leydesdorff
 Emeritus, University of Amsterdam
 Amsterdam School of Communications Research (ASCoR)
 l...@leydesdorff.net; http://www.leydesdorff.net/
 Honorary Professor, SPRU (http://www.sussex.ac.uk/spru/), University of
 Sussex;
 Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/),
 Hangzhou; Visiting Professor, ISTIC
 (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
 Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of
 London;
 http://scholar.google.com/citations?user=ych9gNYJhl=en


 From: stevenzen...@gmail.com [mailto:stevenzen...@gmail.com] On
 Behalf Of Steven Ericsson-Zenith
 Sent: Tuesday, December 09, 2014 10:13 PM
 To: l...@leydesdorff.net
 Cc: Joseph Brenner; fis
 Subject: Re: [Fis] Information-as-Process


 The problem with this approach (and approaches like it) is that it is
 descriptive and not explanatory. The distribution of the shape, in my
 model, can be described, perhaps, but the process or action decision
 point
 and response covariance is impossible to consider.

 It is for this reason that I use holomorphic functors and
 hyper-functors
 in which I can express the explicit role of a 

Re: [Fis] Neuroinformation?

2014-12-03 Thread Robert E. Ulanowicz
Dear Dr. Isiegas:

I envision neuroinformation as the mutual information of the neuronal
network where synaptic connections are weighted by the frequencies of
discharge between all pairs of neurons. This is directly analogous to a
network of trophic exchanges within an ecosystem, as illustrated in
http://people.biology.ufl.edu/ulan/pubs/SymmOvhd.PDF.

Please note that this measure is different from the conventional
sender-channel-receiver format of communications theory. It resembles more
the structural information inhering in the neuronal network. John
Collier (also a FISer) calls such information "enformation" to draw
attention to its different nature.
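
As a toy illustration of the kind of quantity I have in mind (my own sketch,
with invented data): estimate the mutual information between the binary
spike trains of a pair of neurons; done over all pairs, such values could
weight the links of the network to which the indices of the linked paper
apply:

    import numpy as np

    def spike_mi(x, y):
        # mutual information (bits) between two binary spike trains,
        # from the empirical joint distribution of firing per time bin
        x, y = np.asarray(x), np.asarray(y)
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                pxy = np.mean((x == a) & (y == b))
                px, py = np.mean(x == a), np.mean(y == b)
                if pxy > 0:
                    mi += pxy * np.log2(pxy / (px * py))
        return mi

    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 1000)                    # presynaptic spikes
    y = np.where(rng.random(1000) < 0.8, x, 1 - x)  # noisy follower
    print(f"I(x;y) = {spike_mi(x, y):.2f} bits")    # roughly 0.28 bits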

With best wishes for success,

Bob Ulanowicz

 Dear list,

 I have been reading during the last year all these interesting
 exchanges. Some of them terrific discussions! Given my scientific
 background
 (Molecular Neuroscience), I would like to hear your point of view on the
 topic of neuroinformation, how information exists within the Central
 Nervous System. My task was experimental; I was interested in
 investigating the molecular mechanisms underlying learning and memory,
 specifically, the role of the cAMP-PKA-CREB signaling pathway in such
 brain
 functions (In Ted Abel´s Lab at the University of Pennsylvania, where I
 spent 7 years). I generated several genetically modified mice in which I
 could regulate the expression of this pathway in specific brain regions
 and
 in which I studied the effects of upregulation or downregulation at the
 synaptic and behavioral levels. However, I am conscious that the
 information flow within the mouse Nervous System is far more complex
 that
 in the simple pathway that I was studying...so, my concrete question for
 you Fishers or Fisers, how should we contemplate the micro and macro
 structures of information within the neural realm? What is
 Neuroinformation?

 Best wishes,


 --
 Carolina Isiegas





Re: [Fis] Physical Informatics… (J.Brenner)

2014-10-20 Thread Robert E. Ulanowicz
Dear Bob,

What you are saying is the obverse of the third law of thermodynamics. The
third law says that entropy (viz., disorder) can only be measured with
respect to some reference condition. Since information is the complement
of entropy in Bayesian informatics, the obverse becomes: "Information
can only be measured with respect to some reference state." (It may be the
same one used to pin down entropy.) Changing the reference state changes
the values for both information and entropy.

I tried elaborating those relationships in my FIS paper
http://people.biology.ufl.edu/ulan/pubs/FISPAP.pdf.

The best,
Bob U.

 Dear all - my take on this post is that the question of whether physical
 processes are information is like the question: Is there a sound if a tree
 falls in the forest and no one is there to listen? This is like the Zen
 koan: what is the sound of one hand clapping? If no one is in the forest
 are the trees information? Well for sure they are trees but as to whether
 or not they are information that is strictly dependent on the point of
 view of the respondent. For me they are just trees and here is why I think
 so. For me information is about a process. The noun information relates to
 the verb inform. If no one is being informed there is no information. In
 the same way that if no one or thing is there being loved (verb) there is
 no love (noun). If no one is engaged in the activity of loving (a verb)
 there is no love (a noun). If there is no one being informed (a verb) then
 there is no information (a noun). Now one can talk about an object or a
 phenomenon having the possibility of informing someone which to my mind is
 potential information which is what I would call the physical processes
 that take place in our universe. A book written in Urdu is potential
 information because an Urdu reader can be informed by it. For me as a
 non-Urdu speaker there is very little information other than someone went
 to the trouble of writing out a text with Urdu letters and hence there is
 probably information there for an Urdu speaker reasoning why would any one
 make the effort to create such an object unless that person wanted to
 inform Urdu speakers. Just as one person's food is another person's poison
 so it is that one person's information is for another person merely
 a physical phenomenon such as processes in nature, ink on paper, sounds or
 EM signals. Shannon developed a theory of signals in which some of those
 signals have the ability to inform some recipients. I hope this collection
 of words has informed you other than giving you the knowledge of my view
 as to what constitutes information. Thanks to Joseph, Pedro, and Igor for
 the opportunity to reflect on the nature of information. If you enjoyed my
 post and would like to learn more about my views on information please
 send me an email off line and I will send you an email version of my book
 What is Information?  Propagating Organization in the Biosphere, the
 Symbolosphere, the Technosphere and the Econosphere for free. And now you
 know what an infomercial is. This was an infomercial because of my offer
 to share my book with you erudite scholars of FIS whose posts I always
 enjoy. With kind regards - Bob
 __

 Robert K. Logan
 Prof. Emeritus - Physics - U. of Toronto
 Chief Scientist - sLab at OCAD
 http://utoronto.academia.edu/RobertKLogan
 www.physics.utoronto.ca/Members/logan
 www.researchgate.net/profile/Robert_Logan5/publications








 On 2014-10-20, at 1:57 PM, PEDRO CLEMENTE MARIJUAN FERNANDEZ wrote:



 - Original Message -
 From: Joseph Brenner
 To: Igor Gurevich ; Pedro C. Marijuan ; fis
 Sent: Monday, October 20, 2014 8:40 AM
 Subject: Re: [Fis] Physical Informatics contains fundamental results
 which impossible to get only by physical methods

 Dear Igor, Dear Gerhard and Colleagues,

 In Igor's summary of his recent work, I read the following absolutely
 critical statement:
 "It is shown that the expansion of the Universe is the source of
 information formation, wherein a variety of physical processes in an
 expanding Universe provide information formation." I take this as
 meaning that the expansion of the Universe as such does not produce
 information.

 Gerhard's formulation is slightly different (my paraphrase):
 The first asymmetry in energy distribution, following the singularity,
 is the source of information formation.

 My question is, therefore, how best to combine these insights. For
 example, we may say that the variety of physical processes are all the
 consequence of, and subsequently reflect, a first asymmetry.

 It is also interesting to note that the approaches of both Igor and
 Gerhard imply the emergence of information through the interactional
 impact (informational interactions) of fundamental forces on particles,
 extended by Gerhard to somewhat higher levels of organization (life)
 than Igor.

 I look forward to further discussion of these fundamental 

Re: [Fis] Fw: Responses

2014-01-21 Thread Robert E. Ulanowicz

 The reason of being of information, whatever its content or quantity, is
 to be used by an agent (biological or artificial).

Dear Christophe,

In making this restriction you are limiting the domain of information to
communication and excluding all information that inheres in structure
per se. John Collier has called the latter manifestation "enformation",
and the calculus of IT is quite effective in quantifying its extent.
Perhaps John would like to comment?

Cheers,
Bob U.




Re: [Fis] FW: Responses

2014-01-12 Thread Robert E. Ulanowicz
Dear Christophe,

I tried to qualify my use of "meaning", but perhaps I wasn't clear enough.

In my example I wanted to say that I(A;B) is a quantity that can be
considered a "proto-meaning" of B to A. Another way of saying the same
thing is that I(A;B) quantifies A in the context of B.
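
In symbols (my own gloss, via the standard identities):

    I(A;B) = \sum_{a,b} p(a,b)\,\log\frac{p(a\mid b)}{p(a)} = H(A) - H(A\mid B)

i.e., the average reduction in uncertainty about A once B is given. That
average log-ratio of posterior p(a|b) to prior p(a) is what I am calling
the proto-meaning of B to A; the leftover H(A|B) is the flexibility.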

I should have added that I don't confine my notion of information to the
scenario of communication. I feel that its discovery in that context was
an historical accident. Rather, like Stan, I consider information more
generally as constraint, and the information of communication becomes a
subset of the more general attribute.

Hence, anything that is held together by constraints is amenable in one
form or another to the Bayesian forms of Shannon capacity. The real
advantage in doing so is that the complement of the Bayesian information
is made explicit. Vis-a-vis constraint this complement becomes
flexibility. Such flexibility is an apophasis that is missing from most
scientific endeavors, but is essential to our understanding of evolution.

You are probably correct that my terminology is not orthodox and possibly
confusing to some. But I see such inconvenience as a small price to pay
for opening up a new window on quantitative evolutionary theory. I really
want folks to think outside the box of communication theory. What Shannon
started many (such as Terry Deacon) have prematurely cast aside. My
message is that we need to re-evaluate Shannon-type measures in their
Bayesian contexts. They have the potential of becoming very powerful
quantitative tools. (E.g.,
http://www.cbl.umces.edu/~ulan/pubs/EyesOpen.pdf.)

Peace,
Bob


 Bob,

 You seem to implicitly consider all information as being meaningful.

 I'm afraid such a position is source of confusion for a scientific
 approach to information.

 As we know, Shannon defined a quantity of information to measure the
 capacity of the channel carrying it. He did not address the “meaning” that
 the information may carry (as you write in your paper, “Shannon
 information considers the amount of information, nominally in bits, but is
 devoid of semantics”). Today DP & Telecom activities use Shannon-type
 information without referring to its meaning. Considering “information” as
 being meaningful looks to me potentially misleading and a source of
 misunderstandings in our discussions. Information can be meaningful or
 meaningless. The meaning comes from the system that manages the
 information.

 Some research activities explicitly consider information as meaningful
 data (see http://www.mdpi.org/entropy/papers/e5020125.pdf). I'm afraid
 such a position creates some vocabulary problems if we want to keep a
 scientific background when trying to understand what “meaning” is.

 The meaning of information is not something that exists by itself. The
 meaning of information is related to the system that manages the
 information (creates it or uses it). And the subject has to be addressed
 explicitly, as systems can be animals, humans or machines. Focusing on
 meaning generation by a system can bring some clarification (see
 http://www.mdpi.org/entropy/papers/e5020193.pdf).

 Hope this helps

 Christophe





 From: lo...@physics.utoronto.ca
 Date: Sat, 11 Jan 2014 10:49:28 -0500
 To: gordana.dodig-crnko...@mdh.se
 CC: fis@listas.unizar.es
 Subject: Re: [Fis] Fw:  Responses

 Dear Friends - I have been lurking as far as the discussion of Shannon
 information is concerned. I must confess I am a bit confused by the use of
 the term information associated with what you folks call Shannon
 information. To my way of thinking Shannon produced a theory of signals
 and not information. Signals sometimes contain information and sometimes
 they do not which is exactly what Shannon said of his notion of
 information. Here is what troubles me: A set of random numbers according
 to Shannon has more information than a structured set of numbers like the
 set of even numbers. For me a random set of numbers contains no
 information. Now I am sure you will agree with me that DNA contains
 information and certainly more information than a soup of random organic
 chemicals which seems to contradict Shannon's definition of information. I
 would appreciate what any of you would make of this argument of mine. Here
 is another thought about information that I would love to have some
 comments on. The information in this email post to you folks will appear
 on multiple computers, it might be converted into ink on paper, it might
 be read aloud. The information is not tied to any particular physical
 medium. But my DNA cannot be emailed, printed out or in anyway separated
 from the physical medium in which it is instantiated. As far as I know it
 has been transferred in part to my 4 kids and 4 grandkids so far and there
 it stops for time being at least. The information in my DNA cannot be
 separated from the medium in which it is instantiated. This information is
 not symbolic. DNA is not a symbol of RNA but