My responses to recent posts by Karl, Stan, Joe, Loet, Gavin, John, and Bob, keyed to the number of the digest in which I received them. I seek to address several basic issues.


First, I would repeat my assertion from my post of Jan. 20, 2010, along with 
Karl’s denial and my comments about his denials:


JLRC: The unspoken premise of many discussants appears to me to be a view of 
information theory as a universal glue, a universal predicate, a universal code.

KJ: The assertion is outspoken, explicit and apodictically declaratory: 
information theory IS a universal glue, a universal predicate, a universal code


Karl: Out-spoken?

JLRC:  Yes, I spoke-out.  :-)

Karl: Explicit? 

JLRC: Yes. Rosen argues that biology requires a separate symbol system, that is 
outside of mathematical category theory. My explicit response to the category 
theory approach to information theory is contained in three recent papers – 
Axiomathes, Discrete Applied Math, and a chapter in a book by Vrobel and Otto 
Rossler. If desired, I will forward copies of these papers to list members.

Karl: apodictically declaratory: 

JLRC: Yes!  By design.   ;-) 

JLRC: Perhaps you have not considered the reasons why Shannon information 
lacks universality. So, precisely what is it that you are denying about the 
appearances of information theory?

·      That category theory is applicable to biology?

·      By inference, that set theory / predicate logic is sufficient to 
describe optical isomers?

·      That the simple “yes/no” choice essential to Shannon information is a 
universal code for human knowledge?  (The notion of a binary encoding of all 
information is denied by Dalton’s premise – the ostensive source of chemical

·      Or, is it that you believe that addition is a universal operation of arithmetic?

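To make the binary-choice point concrete, here is a minimal sketch (mine, purely illustrative) of the Shannon measure for a single “yes/no” choice:

```python
import math

def binary_entropy(p):
    """Shannon entropy, in bits, of a single yes/no choice with P(yes) = p."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty, hence no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair yes/no choice carries exactly one bit:
print(binary_entropy(0.5))  # 1.0
# A biased choice carries less than one bit:
print(binary_entropy(0.9))
```

Whether that one-bit unit is a universal code for human knowledge is, of course, exactly the point in dispute.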
JLRC: Your numerous posts on your decade-long meditations on the nature of 
arithmetic remain unpersuasive. The consistency of group theory and ring theory 
provides an adequate explanation for all iterative arithmetic operations. Your 
persistence is admirable and your intelligence is substantial, but your logic is 
questionable and your conclusions lack extension.


Stan (245:10) Re: [Fis] Ostension and the Chemical / Molecular Biological 
Science,   …It is this translation from material observations into logical 
form, in particular into fully explicit, crisp logical form that I am 
questioning.  Yes, it can lead to short term triumphs, via engineering,…


JLRC: Hummmm, I think you miss the point. The abstract symbol systems of 
Dalton, Lavoisier, and Coulomb underlie the foundations of thermodynamics, as 
well as the Shannon theory of information and our concepts of such abstractions 
as “energy” and “entropy.” These symbol systems are now firmly embedded in the 
logic of scientific communications. Perhaps you wish to imply that the concept 
of ostension is not useful in the natural sciences? Or is it that, in your 
world view, “utility” is a bad word?

BTW, Lavoisier / Daltonian logical forms are not fully explicit in the usual 
sense of mathematics. They are closer to codes with an exact syntax.

Joe (245:11) …”that existence and energy are primitive and numbers something 

JLRC: Are you putting the cart before the horse? As a consequence of the 
international system of units, number takes priority over all other scientific 
and economic proper names. Number is the antecedent to expressing quantity of 
most any sort.

JLRC: Are you attempting to substitute semantics for syntax in your view of 
information theory? In your view of symbolic logic?  In your view of the 
concept of order?

JLRC: The order of the atomic numbers of the chemical elements stands in 
one-to-one correspondence with any list of objects: with a listing of the 
elements of a group, a listing of the roots of a polynomial, a listing of the 
components of a vector, a listing of the nodes of a graph, and so forth. The 
existence of a listing is essential to the basic attributes of a message. It is 
essential to

Joe (245:11) …and under what conditions one should seek to maximize (because 
valuable) heterogeneity as opposed to homogeneity.


JLRC: What fundamental classes of informational variables can be used to 
express heterogeneity? Homogeneity? How do such classes relate to Rosen’s 
postulates of separate and distinct symbol systems? Or to Aristotelian causal 
structures? Or to Descartes’s “clear and distinct” ideas?


Gavin (245:12) …one of the qualitative foundations of information theory is 
the word frequency of English from Zipf’s law.


JLRC: Minor technical point.  Perhaps you mean the frequency of usage of 
different alphabet symbols in a linguistic message?
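To illustrate what such a symbol-frequency count looks like in practice, a small sketch (the sample message is my own invention, purely illustrative):

```python
from collections import Counter

# An arbitrary sample message; any text would do.
message = "information theory concerns the statistics of symbols in a message"

# Frequency of usage of each alphabet symbol, ignoring spaces:
freqs = Counter(c for c in message if c != " ")
total = sum(freqs.values())

# The relative frequencies are the p_i that enter Shannon's formula.
for symbol, count in freqs.most_common(5):
    print(symbol, round(count / total, 3))
```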


Stan (245:12) … Put otherwise, does anyone know of data about natural things 
that would not deliver a power law?


JLRC: Power laws are the exception, not the rule, in the natural sciences. For 
example, enzymatic catalysis, the source of nearly all biological catalysis, 
does not follow a power law. In general, chemical structures are not 
representable by power laws. A ‘power law’ is an attempt to deny the role of 
individual identity by asserting a family of exponential relations. Power laws 
are extremely useful approximations in the social sciences when exactitude is 
not required. A power law is merely an inference based on inductive reasoning 
about a statistical distribution.
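The last sentence can be made concrete: “fitting” a power law y = c·x^a is nothing more than linear regression on log-transformed data, i.e., an inductive inference from a distribution. A minimal sketch with synthetic (exact, hence idealized) data:

```python
import math

# Synthetic data generated from an exact power law y = 2 * x**1.5,
# so the log-log fit recovers the exponent; real data only approximates this.
xs = [1.0, 2.0, 4.0, 8.0, 16.0]
ys = [2.0 * x ** 1.5 for x in xs]

lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
n = len(xs)
mx, my = sum(lx) / n, sum(ly) / n

# Least-squares slope on the log-log plot = the inferred exponent a.
slope = sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / \
        sum((a - mx) ** 2 for a in lx)
print(slope)  # 1.5
```

Nothing in the fit refers to the individual identity of any data point; only the aggregate distribution matters.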


Gavin (245:13) …Geometricdynamics???


JLRC: Technical point: I am unaware of any dynamical system that is not 
geometric. What are you seeking to communicate with this term?


Loet (245:15)  …H can also be considered as probabilistic entropy. S is 
relevant in the case that the system of reference is the chemico-physical one 
based on collisions among particles.


JLRC: Technical point: Nearly all components of living systems are optical 
isomers: two or more molecules (the number of isomers increasing exponentially 
with the number of asymmetric centers) with the exact same molecular formula 
and identical physical properties – energy, entropy, heat of formation, free 
energy, and all other predicable properties.

Is it your view that information is a universally predicable property in the 
sense of general systems theory / symbolic dynamics, such that the concept of 
identity no longer applies?
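The exponential growth of isomer counts is the textbook 2^n rule: up to 2^n stereoisomers for n asymmetric centers (meso forms can reduce the actual count). A trivial sketch:

```python
def max_stereoisomers(n_asymmetric_centers):
    """Upper bound on stereoisomers: 2**n for n asymmetric (chiral) centers.
    Meso compounds can reduce the actual count below this bound."""
    return 2 ** n_asymmetric_centers

# Glucose (open-chain aldohexose) has 4 asymmetric centers -> up to 16
# stereoisomers, all sharing the molecular formula C6H12O6 and the same
# scalar physical properties (energy, entropy, heat of formation).
print(max_stereoisomers(4))  # 16
```

No scalar thermodynamic variable distinguishes these sixteen; only their identities do.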


Gavin (245:16) “Knowledge transfer systems are Imperative Logic Systems hereto 
totally uncounted for.”


JLRC: I am unable to interpret your rhetoric. Please explain what you mean by a 
‘knowledge transfer system’ and an ‘imperative logic system,’ and how these two 
versions of systems theory are related to the natural sciences. (Ordinarily, 
‘imperative’ logic is associated with necessity and formality.)


John (245:17) Information has to be interpreted. However, information can be 
used to do work with no additional energy input (see some past posts on this 
topic), which suggests a very close connection.

John (245:17) “The entropy budget is made at the expense of information loss in 
these cases.”

John(245:17) “So information capacity can emerge (or even be created) by the 
sort of process you mention. It is all physically grounded, though.”


JLRC: Huh? “Information has to be interpreted”?  An interpretation requires an 
interpreter. Man or Machine?

Is this interpretation something other than a code? A symbolic code?

If information is not interpreted as a code, how is it interpreted?

Are you suggesting that “physical grounding” is something other than 
interpretation in terms of physical variables?

Living systems, as material systems, over a reproductive cycle from antecedent 
to consequence, double their entropy content. The perplex information content 
of the two newly independent systems increases by an enormous, as yet 
incalculable, amount. This is the ostension of life, is it not? This is not 
merely an 'interpretation'.

Bob (245:17) Most especially, the calculus that is built upon the Shannon 
formula has incredibly wide application. 
JLRC: I disagree that the calculus of Shannon is special. It is very close to 
the power set. From the perspective of molecular biology, the Shannon formula 
has little applicability. The human capacity to encode “organization” 
(organic, organelle, organ, relations) into a binary code leads to applications 
of Shannon’s calculus. If the natural form of natural information is a binary 
string, then the Shannon formula works just great. If the natural information 
is not a binary string, then the only method to apply Shannon’s formula is to 
find a way to trans-code the natural information into an encoded form that is 
a binary string.

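To illustrate the trans-coding point: once natural information has been forced into a symbol string, Shannon’s formula applies, but it sees only symbol frequencies, never structure or arrangement. A minimal sketch (the strings are mine, purely illustrative):

```python
import math
from collections import Counter

def shannon_entropy(s):
    """H = -sum(p_i * log2(p_i)) over the symbol frequencies of string s."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

# Two differently arranged 'molecules', trans-coded to strings with the
# same symbol frequencies, receive the same Shannon entropy:
print(shannon_entropy("AABB"))  # 1.0
print(shannon_entropy("ABAB"))  # 1.0 -- frequencies, not structure
```

The formula is blind to exactly the organizational relations that molecular biology cares about.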
Bob (245:17) It is useful anywhere constraint enters the picture.

JLRC: Do you wish to imply that the constraints among the atoms of DNA are 
sufficient to apply Shannon’s formula?

Or, must the constraints be imposed through the action of continuous variables?

In the case of your ascendency theory, the encoding scheme was selected in such 
a manner that the exact discrete Daltonian / Lavoisier / Coulombic constraints 
are suppressed from the calculations. Why? I will not offer you any conjectures 
on why ascendency theory is so suppressive. 

Closing Notes:

I have sought to express a number of deeply intertwined and related issues in 
this post. I assume that most of these perspectives will be dismissed on the 
basis of what John assumes is the power of elderly physicists to control our 
thoughts and the ostensions of generating functions. To quote John,

“Sorry, but it has already been done by people like Wheeler, Gell-Mann and 
Hawking. You are not going to win against them.”

Frankly, John, I find this sentence to be beneath you. Where is your spirit of 
inquiry?

My spirit of Peircian inquiry remains in full bloom. The joy of exploration is 
preferable to humble submission at the altar of the dead sciences. I choose to 
remain, in Karl’s term, “outspoken.” The simple fact of the matter is that 
communication from machine to machine is really relatively simple, while life 
is perplexing.

 Could we agree that Human Communication, from body to body or from mind to 
mind, is perplexing?  ;-) 


