Bravo Arturo - I totally agree - in a paper I co-authored with Stuart Kauffman 
and others we talked about the relativity of 
information and the fact that information is not an absolute. Here is the 
abstract of the paper and an excerpt from the paper that discusses the 
relativity of information. The full paper is available at: 
https://www.academia.edu/783503/Propagating_organization_an_enquiry

Best wishes - Bob Logan

Kauffman, Stuart, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich. 2007. Propagating Organization: An Enquiry. Biology and 
Philosophy 23: 27-45.

Propagating Organization: An Enquiry
Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich

Institute for Systems Biology, Seattle Washington

Abstract: Our aim in this article is to attempt to discuss propagating 
organization of process, a poorly articulated union of matter, energy, work, 
constraints and that vexed concept, “information”, which unite in far from 
equilibrium living physical systems. Our hope is to stimulate discussions by 
philosophers of biology and biologists to further clarify the concepts we 
discuss here. We place our discussion in the broad context of a “general 
biology”, properties that might well be found in life anywhere in the cosmos, 
freed from the specific examples of terrestrial life after 3.8 billion years of 
evolution. By placing the discussion in this wider, if still hypothetical, 
context, we also try to place in context some of the extant discussion of 
information as intimately related to DNA, RNA and protein transcription and 
translation processes. While characteristic of current terrestrial life, there 
are no compelling grounds to suppose the same mechanisms would be involved in 
any life form able to evolve by heritable variation and natural selection. In 
turn, this allows us to discuss at least briefly, the focus of much of the 
philosophy of biology on population genetics, which, of course, assumes DNA, 
RNA, proteins, and other features of terrestrial life. Presumably, evolution by 
natural selection, and perhaps self-organization, could occur on many worlds 
via different causal mechanisms.

Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self-reproduction and the basic work-energy cycle, where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first, information as natural selection assembling 
the very constraints on the release of energy that then constitute work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is, about the world or the internal state of the organism, 
and requires an appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work cycles to create the 
diversity in our observable biosphere.

Our conclusions to date in this enquiry suggest a foundation that views 
information as the construction of constraints which, in their physical 
manifestation, partially underlie the processes of evolution to dynamically 
determine the fitness of organisms within the context of a biotic universe.


Section 4. The Relativity of Information

In Section 2 we argued that the Shannon conception of information is not 
directly suited to describing the information of autonomous agents that 
propagate their organization. In Section 3 we defined a new form of 
information, instructional or biotic information, as the constraints that 
direct the flow of free energy to do work.

The reader may legitimately ask, “isn’t information just information?”, i.e., 
an invariant like the speed of light. Our response to this question is no, and 
so we must clarify what seems arbitrary about the definition of information. 
Instructional or biotic information is a useful definition for biotic systems, 
just as Shannon information was useful for telecommunication channel 
engineering, and Kolmogorov (Shiryayev 1993) information was useful for the 
study of information compression with respect to Turing machines.
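
To make the contrast concrete, here is a minimal Python sketch (the toy string 
and the use of zlib as a stand-in for a Kolmogorov-style compressor are my 
illustration, not from the paper). The two classical measures already answer 
different questions about the same object:

    import math
    import zlib
    from collections import Counter

    def shannon_entropy_per_symbol(s: str) -> float:
        """Estimate Shannon entropy (bits/symbol) from symbol frequencies."""
        n = len(s)
        return sum(-(c / n) * math.log2(c / n) for c in Counter(s).values())

    s = "AB" * 50  # a perfectly regular 100-character string

    # Shannon's question: how many bits per symbol would an optimal code need
    # if symbols were drawn independently with these frequencies?
    print(shannon_entropy_per_symbol(s))   # 1.0 bit/symbol

    # Kolmogorov's question (approximated by a real compressor): how short a
    # description of this exact string exists? The regularity makes it tiny.
    print(len(zlib.compress(s.encode())))  # far fewer than 100 bytes

Neither number is “the” information in the string; each answers the question 
its own formalism was built for.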

The definition of information is relative and depends on the context in which 
it is to be considered. There appears to be no such thing as absolute 
information that is an invariant that applies to all circumstances. Just as 
Shannon defined information in such a way as to understand the engineering of 
telecommunication channels, our definition of instructional or biotic 
information best describes the interaction and evolution of biological systems 
and the propagation of organization. Information is a tool and as such it comes 
in different forms. We therefore would like to suggest that information is not 
an invariant but rather a quantity that is relative to the environment in which 
it operates. It is also the case that the information in a system or structure 
is not an intrinsic property of that system or structure; rather it is 
sensitive to history and environment. To drive this point home, we now 
examine the historical context in which Shannon (1948) information emerged.

Before delving into the origin of Shannon information we will first examine the 
relationship of information and materiality. Information is about material 
things and furthermore is instantiated in material things but is not material 
itself. Information is an abstraction we use to describe the behavior of 
material things, and often is thought of as something that controls, in the 
cybernetic sense, material things. So what do we mean when we say, as we did in 
Section 3, that the constraints are information and information is constraints?

“The constraints are information” is a way to describe the limits on the 
behavior of an autonomous agent who acts on its own behalf but is nevertheless 
constrained by the internal logic that allows it to propagate its organization. 
This is consistent with Hayles’s (1999, p. 72) description of the way 
information is regarded by information science: “It constructs information as 
the site of mastery and control over the material world.” She claims, and we 
concur, that information science treats information as separate from the 
material base in which it is instantiated. This suggests that there is nothing 
intrinsic about information but rather it is merely a description of or a 
metaphor for the complex patterns of behavior of material things. In fact, the 
key question is to what degree information is a complete description of the 
objects in question.

This understanding of the nature of information arises from Shannon’s (1948) 
formulation, dating back to his original paper:

The fundamental problem of communication is that of reproducing at one point 
either exactly or approximately a message selected at another point. Frequently 
the messages have meaning; that is they refer to or are correlated according to 
some system with certain physical or conceptual entities. These semantic 
aspects of communication are irrelevant to the engineering problem. The 
significant aspect is that the actual message is one selected from a set of 
possible messages. The system must be designed to operate for each possible 
selection, not just the one that will actually be chosen since this is unknown 
at the time of design. If the number of messages in the set is finite then this 
number or any monotonic function of this number can be regarded as a measure of 
the information produced when one message is chosen from the set, all choices 
being equally likely.
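
In the equiprobable case Shannon takes the logarithm as that monotonic 
function, so one selection from N equally likely messages yields log2(N) bits. 
A minimal Python sketch (the message set is a made-up illustration):

    import math

    # Shannon's measure for a finite set of N equally likely messages:
    # one selection produces log2(N) bits of information.
    messages = ["attack at dawn", "attack at dusk", "hold position", "retreat",
                "send supplies", "await orders", "regroup", "surrender"]
    print(math.log2(len(messages)))  # 3.0 bits for one choice among N = 8
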
A number of problems for biology emerge from this view of information. The 
first is that the number of possible messages is not finite, because we are not 
able to prestate all possible preadaptations from which a particular message 
can be selected, and therefore the Shannon measure breaks down. Another problem 
is that for Shannon the semantics or meaning of the message does not matter, 
whereas in biology the opposite is true. Biotic agents have purpose and hence 
meaning.

The third problem is that Shannon information is defined independently of the 
medium of its instantiation. This independence of the medium is at the heart of 
the strong AI approach, in which it is claimed that human intelligence does not 
require a wet computer, the brain, to operate, but can be instantiated on a 
silicon-based computer. In the biosphere, however, one cannot separate the 
information from the material in which it is instantiated. DNA is not a sign 
for something else; it is the actual thing in itself, which regulates other 
genes and generates messenger RNA, which in turn controls the production of 
proteins. Information on a computer or a telecommunication device can slide 
from one computer or device to another, and then via a printer to paper, and 
not really change, McLuhan’s “the medium is the message” aside. This is not 
true of living things. The same genotype does not always produce the same 
phenotype.

According to the Shannon definition of information, a structured set of numbers 
like the set of even numbers has less information than a set of random numbers, 
because one can predict the sequence of even numbers. By this argument, a 
random soup of organic chemicals has more information than a structured biotic 
agent. Yet the biotic agent has more meaning than the soup. The living 
organism, with more structure and more organization, has less Shannon 
information. This is counterintuitive to a biologist’s understanding of a 
living organism. We therefore conclude that the use of Shannon information to 
describe a biotic system is not valid: Shannon information applied to a biotic 
system is simply a category error.
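
The point can be checked against the entropy formula itself, 
H = -Σ p log2(p). A minimal Python sketch (the two distributions are a toy 
illustration): a fully predictable next element carries zero bits, while a 
uniformly random digit carries the maximum:

    import math

    def entropy_bits(probs):
        """Shannon entropy H = sum of -p * log2(p), in bits."""
        return sum(-p * math.log2(p) for p in probs if p > 0)

    # Given the rule "2, 4, 6, 8, ...", the next element is certain:
    print(entropy_bits([1.0]))       # 0.0 bits -- structure means no surprise
    # A uniformly random digit 0-9 is maximally uncertain:
    print(entropy_bits([0.1] * 10))  # ~3.32 bits per digit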

A living organism has meaning because it is an autonomous agent acting on its 
own behalf. A random soup of organic chemicals has no meaning and no 
organization. We may therefore conclude that a central feature of life is 
organization—organization that propagates.


______________________

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications


On Dec 11, 2016, at 10:57 AM, tozziart...@libero.it wrote:

Dear FISers,

I know that some of you are going to kill me, but there’s something that I must 
confess. 

I notice, from the nice issues raised by Francesco Rizzo, Joseph Brenner, and 
John Collier, that the main concerns are always energetic/informational 
arguments and accounts.

Indeed, the current tenets state that all is information, information being a 
real quantity that can be measured through informational entropies.

But… I ask myself: is such a tenet true?

When I cook pasta, I realize that, from my point of view, the cooked pasta 
encompasses more information than the uncooked pasta, because it acquires the 
role of something I can eat in order to increase my chances of preserving 
myself in the hostile environment that wants to destroy me. However, from the 
point of view of the bug that eats the uncooked pasta, my cooked pasta surely 
displays less information. Therefore, information is a very subjective measure 
that, apart from its relationship with the observer, does not mean very much… 
Who can state that an event or a fact displays more information than another 
one?

And, please, do not counter that information is a quantifiable, objective 
reality because it can be measured through informational entropy… 
Informational entropy, in Shannon’s original formulation, assumes an ergodic 
process (page 8 of Shannon’s seminal 1948 paper), i.e., every sequence produced 
by the process has the same statistical properties, or, in other words, a 
traveling particle always crosses all the points of its phase space. However, 
in physics and biology, facts and events are never ergodic. Statistical 
homogeneity is just a fiction, if we evaluate the world around us and our 
brain/mind. 
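
To see what ergodicity buys, and what its failure costs, here is a minimal 
Python sketch (the two-coin source is a toy illustration, not Shannon's): a 
biased coin is chosen once and then flipped forever, so the entropy estimated 
from any single sequence never matches the ensemble entropy:

    import math
    import random

    def h2(p):
        """Binary Shannon entropy in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    random.seed(1)
    # Non-ergodic source: one of two biased coins is picked once, then used forever.
    bias = random.choice([0.1, 0.9])
    seq = [1 if random.random() < bias else 0 for _ in range(100_000)]

    p_hat = sum(seq) / len(seq)
    print(h2(p_hat))  # ~0.47 bits: the time average over one realization
    print(h2(0.5))    # 1.0 bits: the ensemble average over the coin choice
    # For an ergodic source these two numbers would agree in the long run; here
    # they never do, so one observed sequence misstates the source's entropy.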

Therefore, the role of information might not be as fundamental as currently 
believed.

P.S.: topology analyzes information from another point of view, but that is an 
issue for next time, I think…

Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/


_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

