[Fis] FIS NEW YEAR LECTURE

2015-01-07 Thread Pedro C. Marijuan

Dear FIS Colleagues,

Following our recently established tradition, for the 2015 FIS New 
Year Lecture we will be joined by:

*_Terrence W. Deacon_*
Professor of Anthropology
Cognitive Science Faculty
University of California at Berkeley

He will deliver the Lecture:

_*Steps to a Theory of Reference & Significance in Information*_

Members of the thought collective The Pirates are also invited to the 
discussion.
The theme will be developed in an Essay attached to my next message. In 
order to facilitate discussion I will copy within the message's body the 
Introduction and some of the concluding comments of the Essay. Let us 
anticipate a really exciting discussion!


Best wishes to all, FISers and Pirates, and Happy New Year!

--Pedro

-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Steps to a theory of reference & significance

2015-01-07 Thread Pedro C. Marijuan


_*Steps to a theory of reference & significance in information*_
*FIS discussion paper by Terrence W. Deacon (2015)*

This is the link to download the whole paper: 
https://www.dropbox.com/s/v5o8pwx3ggmmmnb/FIS%20Deacon%20on%20information%20v2.pdf?dl=0


/The mere fact that the same mathematical expression -Σ p_i log p_i 
occurs both in statistical mechanics and in information theory does 
not in itself establish any connection between these fields. This can 
be done only by finding new viewpoints from which thermodynamic 
entropy and information-theory entropy appear as the same concept./ 
(Jaynes 1957, p. 621)


/What I have tried to do is to turn information theory upside down to 
make what the engineers call 'redundancy' [coding syntax] but I call 
'pattern' into the primary phenomenon. . . ./ (Gregory Bateson, 
letter to John Lilly on his dolphin research, 10/05/1968)


*Introduction*
In common use and in its etymology the term ‘information’ has always 
been associated with
concepts of reference and significance—that is to say it is about 
something for some use. But
following the landmark paper by Claude Shannon in 1948 (and later 
developments by Wiener,
Kolmogorov, and others) the technical use of the term became almost 
entirely restricted to the signal properties of a communication 
medium, irrespective of reference or use. In the 
introduction to this seminal report, Shannon points out that although 
communications often have
meaning, “These semantic aspects of communication are irrelevant to the 
engineering problem”
which is to provide a precise engineering tool to assess the 
computational and physical demands of the transmission, storage, and 
encryption of communications in all forms.
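
As a point of reference for the Jaynes epigraph above, the expression 
in question takes two standard textbook forms, identical up to a 
constant and the base of the logarithm (standard material, added here 
only for orientation):

    H = -\sum_i p_i \log_2 p_i     % Shannon entropy, in bits
    S = -k_B \sum_i p_i \ln p_i    % Gibbs entropy, statistical mechanics

where the p_i are the probabilities of the possible messages or 
microstates, respectively.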

The theory provided a way to precisely measure these properties as well 
as to determine
limits on compression, encryption, and error correction. By a sort of 
metonymic shorthand this
quantity (measured in bits) came to be considered synonymous with the 
meaning of
‘information’ (both in the technical literature and in colloquial use in 
the IT world) but at the cost of inconsistency with its most 
distinctive defining attributes.
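
As a minimal sketch of what this quantity measures (plain Python, 
illustrative only; the toy source and numbers are not from the essay):

    import math

    def shannon_entropy(probs):
        # Shannon entropy H = -sum(p * log2(p)), in bits per symbol
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical four-symbol source: P(A)=0.5, P(B)=0.25, P(C)=P(D)=0.125
    H = shannon_entropy([0.5, 0.25, 0.125, 0.125])
    print(f"H = {H} bits/symbol")  # 1.75 bits: the lossless-compression floor

By Shannon's source coding theorem, no lossless code for this source 
can average fewer than H bits per symbol, which is the sense in which 
the quantity sets limits on compression.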

This definition was, however, consistent with a tacit metaphysical 
principle assumed in the
contemporary natural sciences: the assertion that only material and 
energetic properties can be
assigned causal power and that appeals to teleological explanations are 
illegitimate. This
methodological framework recognizes that teleological explanations 
merely assign a locus of
cause but fail to provide any mechanism, and so they effectively mark a 
point where explanation
ceases. But this stance does not thereby entail a denial of the 
reality of teleological forms of causality, nor does it require that 
they be entirely reducible to intrinsic material and energetic 
properties.

Reference and significance are both implicitly teleological concepts in 
the sense that they
require an interpretive context (i.e. a point of view) and are not 
intrinsic to any specific physical
substrate (e.g. in the way that mass and charge are). By abstracting the 
technical definition of
information away from these extrinsic properties Shannon provided a 
concept of information that
could be used to measure a formal property that is inherent in all 
physical phenomena: their
organization. Because of its minimalism, this conception of information 
became a precise and
widely applicable analytic tool that has fueled advances in many fields, 
from fundamental
physics to genetics to computation. But this strength has also 
undermined its usefulness in 
fields distinguished by the need to explain the non-intrinsic properties 
associated with
information. This has limited its value for organismal biology where 
function is fundamental, for
the cognitive sciences where representation is a central issue, and for 
the social sciences where
normative assessments seem unavoidable. So this technical 
redefinition of information has been both a virtue and a limitation.

The central goal of this essay is to demonstrate that the previously set 
aside (and presumed
nonphysical) properties of reference and significance (i.e. normativity) 
can be re-incorporated
into a rigorous formal analysis of information that is suitable for use 
in both the physical (e.g.
quantum theory, cosmology, computation theory) and semiotic sciences 
(e.g. biology, cognitive
science, economics). This analysis will build on Shannon’s formalization 
of information, but will
extend it to explicitly model its link to the statistical and 
thermodynamic properties of its
physical context and to the physical work of interpreting it. It is 
argued that an accurate analysis
of the non-intrinsic attributes that distinguish information from mere 
physical differences is not
only feasible, but necessary to account for its distinctive form of 
causal efficacy.
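
Deacon's own formal proposal is developed in the body of the paper. 
For a concrete sense of the kind of bits-to-thermodynamics bridge at 
stake, one long-established result (not Deacon's proposal) is 
Landauer's principle: erasing one bit of information dissipates at 
least k_B T ln 2 of energy. A quick plain-Python check of the 
magnitude at room temperature:

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # room temperature, K

    # Landauer bound: minimum energy dissipated to erase one bit
    E_min = k_B * T * math.log(2)
    print(f"{E_min:.3e} J per bit erased")  # ~2.871e-21 J

Tiny, but strictly nonzero: even at this minimal level, handling 
information has an unavoidable physical cost, which is the sort of 
link between signal, entropy, and work that the essay proposes to 
make explicit.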


Initial qualitative and conceptual steps toward this augmentation of 
information theory