Dear Jerry,

My apologies for taking so long to reply! I have been overwhelmed with
email queries, civic and social obligations, and responsibilities for the
liturgical rubrics in my parish. I haven’t spent very much time online as
a result.

I sympathize entirely with your apprehensions, but I think they are
inapplicable here.

I share your concerns that homogeneous variables like mass, energy,
charge, etc. are inappropriate to heterogeneous situations. Gregory
Bateson made the distinction between the former, which he called “pleroma”,
and the latter, which he characterized as “creatura”. He pointed out how
the former are insufficient to describe the latter. Later, Walter Elsasser
pointed out how the logic of the laws of physics, equivalent as they are
to operations on homogeneous sets, does not apply to heterogeneous
systems, especially biological ones.
<http://www.vordenker.de/elsasser/we_logic-biol.pdf>

You are certainly correct in pointing out that one needn’t be concerned
only with living systems, in that the transition from physics to chemistry
already crosses this divide.

The problem is that I do not see the divide as being as dichotomous as you
portray it.

You are probably aware of the realm of chemical thermodynamics, where the
effort has been made to incorporate attributes of heterogeneity into
variables that characterize the entire system. For example, the Gibbs (and
Helmholtz) free energies are defined for chemical reaction systems. Changes
in the amounts of the various tokens contribute to changes in the whole-system
quantity.
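
To put the standard textbook relation behind that statement (nothing here
is specific to our discussion), each chemical species i enters the
whole-system function through its own chemical potential μ_i, so that

   dG = -S dT + V dP + Σ_i μ_i dn_i ,

and a change dn_i in any one of the heterogeneous tokens thereby registers
as a change in the single quantity G.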

It is an attempt to marry the two domains by folding the heterogeneous
tokens into a pseudo-homogeneous system function. Of course this is
precisely what the Shannon formula does. The key to my assertion is that
the Shannon variable can be decomposed WITH RESPECT TO A SECOND, REFERENCE
DISTRIBUTION into two terms – one that quantifies the order (amount of
constraint) that the two distributions exert on each other, and a second that
quantifies the freedom that the two enjoy from each other. The second
term, called the “conditional entropy” in information theory, is actually a
better homolog to physical entropy than the Shannon formula.
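
For concreteness, here is a minimal numerical sketch of that decomposition
(the joint distribution below is an arbitrary toy example, nothing more):
the Shannon entropy of the first distribution splits exactly into the
mutual information with the reference distribution plus the conditional
entropy given it.

import numpy as np

# Toy joint distribution p(x,y): rows index the first distribution,
# columns index the second, reference distribution.
p = np.array([[0.30, 0.10],
              [0.05, 0.55]])

px = p.sum(axis=1)          # marginal of the first distribution
py = p.sum(axis=0)          # marginal of the reference distribution

def H(q):                   # Shannon entropy in bits
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

# Mutual information: the constraint the two distributions exert on each other.
I = sum(p[i, j] * np.log2(p[i, j] / (px[i] * py[j]))
        for i in range(2) for j in range(2) if p[i, j] > 0)

# Conditional entropy H(X|Y): the freedom X retains once Y is known.
Hc = H(p.flatten()) - H(py)

print(H(px), I + Hc)        # identical: H(X) = I(X;Y) + H(X|Y)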

In applying this calculus to arbitrary networks, Rutledge et al. (J.
theor. Biol. 57:355) showed true genius by identifying the first
distribution with the distribution of inputs into the nodes and the second
with the distribution of outputs from the same nodes. The mutual
information (the total effective mutual constraint) then becomes a measure
of the internal order in the system, while the conditional entropy serves
as a surrogate for its entropy. The crucial point is that these two terms
are exactly complementary, so that if one is somehow indeterminate, the
other must be likewise.
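
By way of illustration only (the three-node flow matrix below is entirely
hypothetical), Rutledge's identification works out as follows: normalize
the flows into a joint distribution, take the output and input
distributions of the nodes as the two marginals, and the average mutual
information and the conditional entropy then sum exactly to the joint flow
entropy.

import numpy as np

# Hypothetical flow matrix: T[i, j] = flow from node i to node j (made-up numbers).
T = np.array([[0.0, 4.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 2.0, 0.0]])

p = T / T.sum()              # joint distribution of flows
p_out = p.sum(axis=1)        # distribution of outputs from each node
p_in  = p.sum(axis=0)        # distribution of inputs into each node

nz = p > 0
H_joint = -np.sum(p[nz] * np.log2(p[nz]))        # total flow diversity

# Average mutual information: the effective constraint among the flows.
AMI = np.sum(p[nz] * np.log2(p[nz] / np.outer(p_out, p_in)[nz]))

# Conditional (residual) entropy: the freedom the flows retain.
Hc = -np.sum(p[nz] * np.log2(p[nz]**2 / np.outer(p_out, p_in)[nz]))

print(AMI + Hc, H_joint)     # exactly complementary: AMI + Hc = H_joint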

Now, I confess that I have taken a major liberty in identifying
statistical entropy with physical entropy. But the third law has its
homolog in information theory in the result that statistical entropy is
always relative. Whence, the inherent structural information of the
network must likewise always remain relative.

I further confess that I have always inveighed against identifying
physical entropy with statistical entropy. They are, however, accurate
homologs of one another, and that is all that I am claiming.

I remark in passing that the mapping of physical elements into the
integers is decidedly homomorphic and not isomorphic. The number 6 refers
to the number of protons in the nucleus of a carbon atom, nothing more.
There are a variety of isotopes, ionic and radical forms that also map
into the same integer. Each has its own properties that would factor into
any physical measurement on a mixture of these varieties.
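
A trivial sketch makes the many-to-one character plain (the species list
is illustrative, not exhaustive):

# Physically distinct carbon species all collapse onto the same integer,
# the proton count Z = 6; the map forgets neutrons and electrons, so it
# cannot be inverted.
Z = {"C-12": 6, "C-13": 6, "C-14": 6, "C+": 6, "C-": 6}
assert len(set(Z.values())) == 1    # many species, one integer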

So, in conclusion, I would readily agree that a thermodynamics that is
strictly confined to pleroma cannot fully illumine attributes of a
heterogeneous system. But contemporary thermodynamics is not so
constrained; neither is information theory, and even chemistry is not
without its own hidden heterogeneities.

As regards information theory, statistical entropy is always relative,
which forces its complementary information to be likewise.

Cheers,
Bob

> List, Bob:
>
>> On Mar 27, 2017, at 10:37 AM, Robert E. Ulanowicz <u...@umces.edu>
>> wrote:
>>
>> First off, that information is always relative is the obverse of the
>> third
>> law of thermodynamics. It cannot be otherwise.
>> <http://people.clas.ufl.edu/ulan/files/FISPAP.pdf>
>
> First off?
>
> I fear that I am rather skeptical about this assertion for a simple
> structural reason that illuminates the scientific inadequacy of
> thermodynamics as a source of scientific apperceptions.
>
> The notion of a general mathematical form of information residing within
> the Third Law of Thermodynamics is difficult for me to imagine, because
> the chemical table of elements assigns a unique physical structural form
> and mathematical count to each individual chemical element.
>
> The apprehension of matter in the theory of thermodynamics is constrained
> to the use of the symbol for mass (as a continuous variable).
> But, the electrical structural information content is different for each
> element of the table of elements.
>
> This fact is a major impediment to the application of thermodynamic
> principles to the chemical and biological sciences. The closure of the
> thermodynamic symbol system (P, V, T, F, G, S, m) excludes the direct use
> of chemical symbols within the entropic framework.
>
> Can anyone conjure up a compelling counter-argument to this line of
> argumentation?
> For example, does the chemical structure of DNA contain information
> (independent of temperature)?
>
> Cheers
>
> Jerry
>
> “The union of units unifies the unity.”
> “The disunion of the unity separates the units.”
>

