Thanks for the response Jeremy. I appreciate the rigor of your piratic comments 
(perhaps at fis we are more relaxed in constructing our arguments; rather than a 
tightly-knit discussion group, fis is more like a caravan of very heterogeneous 
"knowledge traders")... Anyhow, concerning the plausibility of approaching the 
prokaryotic cell, it is an attempt that my mini-group has developed in several 
works (not very successfully in attracting the attention of peers; but, alas, 
that's quite another matter!). See "On prokaryotic intelligence..." (Marijuan et 
al., BioSystems 99(2): 94-103, 2010), which is perhaps the best exemplar; also 
del Moral et al., Kybernetes 43(6): 846-864, 2014, and several others focused on 
Mycobacterium tuberculosis. Even at the very elementary level at which we have 
worked, there seems to be ample room for advancing in information-related 
directions, perfectly congruent with the main theme of this discussion too 
(reference & significance).

Beyond the single-cell level, there are several information- or 
communication-thresholds in biological evolution that need to be understood 
better. Once crossed, complexity runaways take place... (also in our societies). 
I think the story would make better sense if the intertwining of self-production 
flows and communication flows were contemplated differently, bringing in the 
idea of "absence" too (at least, I have also attempted this!). Although 
minimally, we should be able to contribute to the social-informational arena of 
our times. See for instance the great works on "social physics" by Alex Pentland 
(shouldn't it rather be called "social information science"?). But in order to 
do that, we should take care that the microphysical advances in the foundations 
of information can scale up with some parsimony...

These comments have become pretty "tangential" --but often at fis we indulge in 
this sort of free-wheeling tangent!
best ---Pedro

________________________________
From: Jeremy Sherman [mindreadersdiction...@gmail.com]
Sent: Friday, 09 January 2015 0:39
To: PEDRO CLEMENTE MARIJUAN FERNANDEZ
Cc: fis
Subject: Re: [Fis] Steps to a theory of reference & significance

Hi Pedro,

Jeremy Sherman here, a long-time pirate. Pleased to meet you. You say:

I am also critical of the autogenesis model systems--wouldn't it be far clearer 
to approach a (relatively) simple prokaryotic cell and discuss its intertwining 
of communication and self-production arrangements? The way a bacterium "sees" 
the world, and reorganizes its living, could be a very useful analysis. I think 
it leads to a slightly different outcome regarding reference/significance, and 
meaning/value/fitness.

Terry and the Pirates have a long-standing rule: one cannot employ as 
explanation that which has not yet been explained. Failing to hold to this 
standard opens researchers up to merely taxonomical work, positing forces, 
properties and capacities defined solely by their consequences, in effect 
mistaking questions for answers. Hence our focus on exploring reference at its 
earliest possible emergence, and on explaining exactly how that emergence 
occurs, since emergence is itself a question, not an answer: an explanandum, not 
an explanans.

Somewhat related, I recently came across this:

Epistemological particularism is the belief that one can know something without 
knowing how one knows that thing. By this understanding, one's knowledge is 
justified before one knows how such belief could be justified. Taking this as a 
philosophical approach, one would ask the question "What do we know?" before 
asking "How do we know?" The term appears in Roderick Chisholm's "The Problem of 
the Criterion", and in the work of his student, Ernest Sosa ("The Raft and the 
Pyramid: Coherence versus Foundations in the Theory of Knowledge"). 
Particularism is contrasted with Methodism (in the philosophical sense), which 
answers the latter question before the former. Since the question "What do we 
know?" implies that we know, particularism is considered fundamentally 
anti-skeptical, and was ridiculed by Kant in the Prolegomena.

We Pirates do what we can to stay on the epistemological methodist side of 
things.

Even the simplest prokaryotic cell is extraordinarily complex. We don't want to 
run before we can walk.  The briskest runners-before-walkers are those who want 
to go straight from physics to human consciousness, a leap that we think makes 
the endeavor thoroughly intractable.

Best,

Jeremy

On Thu, Jan 8, 2015 at 4:48 AM, Pedro C. Marijuan 
<pcmarijuan.i...@aragon.es> wrote:
Dear Terry and colleagues,

Thanks a lot for the opening text! It is a well-crafted Essay full of very 
detailed content. My impression is that the "microphysics" of information has 
been solved elegantly --at least at the level of today's relevant knowledge-- 
with your work and the works of related authors, among them Karl Friston, whose 
work could be read as a complementary approach to yours (in particular his 
recent "Life as we know it", Journal of the Royal Society Interface, 10: 
20130475). His Bayesian approach to life's organization, coupled with the 
(variational) "free energy" minimization principle, leads to the emergence of 
homeostasis and a simple form of autopoiesis, as well as the organization of 
perception/action later on. Thus, it is quite close to your approach on 
autogenic systems. About the different sections of the Essay, the very detailed 
points you deal with in section 4 ("steps to a formalization of reference") are, 
in my opinion, the conceptual core and deserve a careful inspection, far more 
than these rushed comments. In any case, the relationship between Boltzmann and 
Shannon entropies has been clarified quite elegantly.
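
(For readers who want the formula behind that reference: the standard textbook 
statement of Friston's variational free energy, not something taken from the 
Essay, is roughly

   F = E_q[ -ln p(o, s) ] - H[ q(s) ]
     = D_KL[ q(s) || p(s | o) ] - ln p(o)

where o are sensory observations, s are hidden/external states, q(s) is the 
organism's internal "recognition" density, and D_KL is the Kullback-Leibler 
divergence. Minimizing F makes q(s) approximate the posterior p(s | o) and 
bounds the "surprise" -ln p(o); that is the sense in which homeostasis-like 
self-organization is said to follow.)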

However, for my taste the following sections have not sufficiently opened up the 
panorama. And with this I start some critical observations. Perhaps the 
microphysics of information is not the critical stumbling block to be removed 
for the advancement of the informational perspective. We could recall McLuhan's 
stance on Shannon's information theory and von Neumann's game theory... yes, 
undoubtedly important advancements, but not the essential stuff of information. 
But on this list there are people far more versed in McLuhan's work who can 
judge whether the caveats he raised would continue to apply (obviously in a 
different way). I am also critical of the autogenesis model systems--wouldn't it 
be far clearer to approach a (relatively) simple prokaryotic cell and discuss 
its intertwining of communication and self-production arrangements? The way a 
bacterium "sees" the world, and reorganizes its living, could be a very useful 
analysis. I think it leads to a slightly different outcome regarding 
reference/significance, and meaning/value/fitness.

If we look at the whole view of the "information world" (human societies, 
behaving individuals, brain organization, cellular processes, biomolecules) and 
at how a myriad of information flows are crisscrossing, ascending, descending, 
focusing, mixing and controlling energy flows, etc., we may have an inkling that 
this evanescent world paradoxically becomes the master of the physical world 
(the "fluff" versus the "stuff", Lanham 2006), and that it is organized far 
beyond the rules of the micro- and macro-physical world. But how? What are the 
essentials of this magnificent "castle in the air" (reminiscent of Escher's 
engraving: http://fis.sciforum.net/ )?

In the next exchanges I will try to add some more specifics on the above 
"fluffy" comments, derived from a fast reading of the Essay. Thanks again, 
Terry, for providing us with this discussion opportunity in the New Year.

best  ---Pedro


*Steps to a theory of reference & significance in information*
*FIS discussion paper by Terrence W. Deacon (2015)*

This is the link to download the whole paper: 
https://www.dropbox.com/s/v5o8pwx3ggmmmnb/FIS%20Deacon%20on%20information%20v2.pdf?dl=0

/"The mere fact that the same mathematical expression - Σ pi log pi occurs both 
in statistical
mechanics and in information theory does not in itself establish any connection 
between these
fields. This can be done only by finding new viewpoints from which 
thermodynamic entropy and
information-theory entropy appear as the same concept." /(Jaynes 1957, p. 621)
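
(For readers unfamiliar with the notation, the two expressions Jaynes has in 
mind are, in their usual textbook forms,

   Shannon (information theory):            H = - Σ_i p_i log2 p_i     [bits]
   Gibbs/Boltzmann (statistical mechanics): S = - k_B Σ_i p_i ln p_i   [J/K]

which differ only by the constant k_B and the base of the logarithm; his point 
is that this formal identity alone establishes no conceptual connection.)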

/"What I have tried to do is to turn information theory upside down to make 
what the
engineers call 'redundancy' [coding syntax ] but I call 'pattern' into the 
primary
phenomenon. . . . “/ (Gregory Bateson, letter to John Lilly on his dolphin 
research, 10/05/1968)

*Introduction*

In common use and in its etymology the term ‘information’ has always been 
associated with
concepts of reference and significance―that is to say it is about something for 
some use. But
following the landmark paper by Claude Shannon in 1948 (and later developments 
by Wiener,
Kolmogorov, and others) the technical use of the term became almost entirely 
restricted to refer
to signal properties of a communication medium irrespective of reference or 
use. In the
introduction to this seminal report, Shannon points out that although 
communications often have
meaning, “These semantic aspects of communication are irrelevant to the 
engineering problem”
which is to provide a precise engineering tool to assess the computational and 
physical demands
of the transmission, storage, and encryption of communications in all forms.

The theory provided a way to precisely measure these properties as well as to 
determine
limits on compression, encryption, and error correction. By a sort of metonymic 
shorthand this
quantity (measured in bits) came to be considered synonymous with the meaning of
‘information’ (both in the technical literature and in colloquial use in the IT 
world) but at the cost
of inconsistency with its most distinctive defining attributes.
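
(As a minimal illustration of what this bit measure amounts to -- a toy example 
of my own, not part of the Essay -- the Shannon entropy of a discrete source can 
be computed as follows:

   # Toy illustration: Shannon entropy in bits of a discrete source.
   # The distributions below are arbitrary examples.
   import math

   def shannon_entropy(probs):
       # H(X) = -sum p*log2(p), skipping zero-probability symbols
       return -sum(p * math.log2(p) for p in probs if p > 0)

   print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits per symbol
   print(shannon_entropy([0.25] * 4))         # 2.0 bits (uniform over 4 symbols)

On this reading, a perfectly predictable source carries 0 bits and a uniform 
source over N symbols carries log2 N bits, regardless of what, if anything, the 
symbols are about.)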

This definition was, however, consistent with a tacit metaphysical principle 
assumed in the
contemporary natural sciences: the assertion that only material and energetic 
properties can be
assigned causal power and that appeals to teleological explanations are 
illegitimate. This
methodological framework recognizes that teleological explanations merely 
assign a locus of
cause but fail to provide any mechanism, and so they effectively mark a point 
where explanation
ceases. But this stance does not also entail a denial of the reality of 
teleological forms of
causality nor does it require that they can be entirely reduced to intrinsic 
material and energetic
properties.

Reference and significance are both implicitly teleological concepts in the 
sense that they
require an interpretive context (i.e. a point of view) and are not intrinsic to 
any specific physical
substrate (e.g. in the way that mass and charge are). By abstracting the 
technical definition of
information away from these extrinsic properties Shannon provided a concept of 
information that
could be used to measure a formal property that is inherent in all physical 
phenomena: their
organization. Because of its minimalism, this conception of information became 
a precise and
widely applicable analytic tool that has fueled advances in many fields, from 
fundamental
physics to genetics to computation. But this strength has also undermined its usefulness in 
its usefulness in
fields distinguished by the need to explain the non-intrinsic properties 
associated with
information. This has limited its value for organismal biology where function 
is fundamental, for
the cognitive sciences where representation is a central issue, and for the 
social sciences where
normative assessment seems unavoidable. So this technical redefinition of information has been 
information has been
both a virtue and a limitation.

The central goal of this essay is to demonstrate that the previously set aside 
(and presumed
nonphysical) properties of reference and significance (i.e. normativity) can be 
re-incorporated
into a rigorous formal analysis of information that is suitable for use in both 
the physical (e.g.
quantum theory, cosmology, computation theory) and semiotic sciences (e.g. 
biology, cognitive
science, economics). This analysis will build on Shannon’s formalization of 
information, but will
extend it to explicitly model its link to the statistical and thermodynamic 
properties of its
physical context and to the physical work of interpreting it. It is argued that 
an accurate analysis
of the non-intrinsic attributes that distinguish information from mere physical 
differences is not
only feasible, but necessary to account for its distinctive form of causal 
efficacy.

Initial qualitative and conceptual steps toward this augmentation of 
information theory have
been outlined in a number of recent works (Deacon 2007, 2008, 2010, 2012; 
Deacon &
Koutroufinis, 2012; Deacon, Bacigaluppi & Srivastava, 2014). In these studies 
we hypothesize
that both a determination of reference and a measure of significance or 
functional value can be
formulated in terms of how the extrinsic physical modification of an 
information bearing
medium affects the dynamics of an interpreting system that exhibits 
intrinsically end-directed
and self-preserving properties.

[...]

A model system
To test these principles and their relationship to reference and significance, 
I and my
colleagues have conceived of an empirically realizable and testable thought 
experiment. As in
most efforts to formalize basic physical properties it is useful to begin with 
a simple model
system in which all aspects of the process can be unambiguously represented. 
For our purposes
we describe a theoretical molecular system called an autogen, which maintains 
itself against degradation by reconstituting damaged components and thereby 
restoring system integrity. This model system involves an empirically realizable 
molecular complex described previously 
model system involves an empirically realizable molecular complex described 
previously
(Deacon 2012; also in Deacon & Cashman 2012; and also called an autocell in 
Deacon 2006a,
2007; 2009; and Deacon & Sherman 2008).

[...]

In this way we can use formal and simulated versions of autogenesis to develop 
a measure of
relative significance, in the form of “work saved.” I hypothesize that this 
simple model system
exemplifies the most basic dynamical system upon which a formal analysis of 
informational
interpretation and significance can be based.
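
(To make the "work saved" notion concrete, a bare-bones sketch of how such a 
comparison might be set up in a simulation is given below; the numbers, function 
names, and quantities are illustrative assumptions of mine, not Deacon's 
formalism.

   # Hypothetical sketch: comparing an "informed" system to an uninformed baseline.
   # All quantities are placeholders in arbitrary work units.

   def work_per_cycle(repair_attempts, cost_per_attempt):
       # Total work expended in one self-repair cycle.
       return repair_attempts * cost_per_attempt

   # Baseline: the system repairs indiscriminately (no environmental information).
   baseline = work_per_cycle(repair_attempts=10, cost_per_attempt=1.0)

   # "Informed": boundary conditions predictive of damage allow selective repair.
   informed = work_per_cycle(repair_attempts=4, cost_per_attempt=1.0)

   work_saved = baseline - informed   # one candidate measure of relative significance
   print(work_saved)

The point of the sketch is only that "significance" is here a difference between 
two amounts of work, measured against the system's own self-maintenance, not an 
intrinsic property of the signal.)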

[...]

In both forms of modification, the autogenic process is provided with 
information referring to its own preservation via boundary conditions (external 
or internal) that are predictive of successful self-preservation. The 
significance of information of either sort is assessed by the relative 
minimization of work per work cycle, and therefore the decreased uncertainty of 
self-referential constraint preservation. In this way interpretation is 
analogous to the 
decrease in
uncertainty that is a measure of received information in Shannonian theory, but 
at a teleodynamic
system level.

Using these three variants of a simple model system I claim that we can 
precisely analyze the
relationships between information medium properties, intrinsically end-directed 
work, and the
way these enable system-extrinsic physical conditions to become referential 
information
significant to system ends. These relationships are not only simple enough to 
formalize, but they
can be simulated by computer algorithms at various levels of logical and 
physical detail. I
believe that creating and experimenting with these simulated autogenic systems 
will enable us to
reframe the mysteries of reference and significance as tractable problems, 
susceptible to exact
formal and empirical analysis. This is still a far cry from a theory of 
information that is
sufficiently developed to provide a basis for a scientific semiotic theory, much 
less an explanation of how human brains interpret information, but it may offer 
a rigorous physical 
explanation of how human brains interpret information, but it may offer a 
rigorous physical
foundation upon which these more complex theories can be developed.
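
(For what it is worth, a simulation of this kind might have a skeleton along the 
following lines; the state variables, rates, and update rules below are 
placeholders of my own and not the authors' algorithm.

   # Bare-bones skeleton of an autogenesis-like persistence simulation.
   # Components degrade stochastically; reciprocal production replaces them.
   import random

   random.seed(0)
   catalysts, shell = 10, 10   # reciprocal catalysts and containment units
   for step in range(1000):
       # degradation: each component is independently lost with probability 0.1
       catalysts = sum(1 for _ in range(catalysts) if random.random() > 0.1)
       shell = sum(1 for _ in range(shell) if random.random() > 0.1)
       if catalysts > 0 and shell > 0:
           # reciprocal production regenerates one unit of each component
           catalysts += 1
           shell += 1
       else:
           print("dissolved at step", step)
           break
   else:
       print("persisted:", catalysts, shell)

Variants of such a loop could then be exposed to "informative" versus 
"uninformative" boundary conditions, comparing work expended and the uncertainty 
of persistence across conditions.)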


*― Terry*
Professor Terrence W. Deacon
University of California, Berkeley

-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-------------------------------------------------

_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis