Dear Loet, Steven, and colleagues,
During the last ten years or so, with particular success in the most
recent years, Karl Friston has developed his free-energy minimization
principle, based on Shannon's information theory and optimal control
theory as well as on the Bayesian brain hypothesis. I think this is the
most advanced work towards a unified brain theory today. The
minimization dynamics of the cerebral free-energy construct (it is a
sort of Helmholtz program revisited) becomes a generative process of
perception, action, learning, and adaptive behavior in general. The 2010
paper (Nature Reviews Neuroscience, doi: 10.1038/nrn2787), where he
argues precisely for a unified brain theory, is quite representative
of his proposals. On a personal note, during the last two decades I
followed and cooperated with Kenneth Paul Collins (we published a book
in Spanish about the emergence of behavior from brain dynamics). Our
scheme was based on the minimization of a collective variable,
supposedly a sort of "entropy" of excitation/inhibition ratios
topologically distributed across the neuronal surfaces of the cortex, a
minimization performed essentially by the medial parts of the brain.
Although very rich in qualitative and behavioral aspects, the formal
part was too weak (awfully weak). Until recent years I could not
meaningfully connect Collins's approach with other work, and
unfortunately he left scientific research long ago--but now the
marriage with Friston's framework is remarkable.
Putting them together may be a very fertile exploratory avenue.
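To make the free-energy idea concrete, here is a minimal numerical
sketch (the Gaussian prior, the quadratic observation mapping g, the
learning rate and all numbers are my own illustrative assumptions, not
Friston's actual formulation): perception is cast as gradient descent
of a point estimate on the variational free energy of a toy generative
model.

# Toy sketch: perception as free-energy minimization (illustrative
# assumptions only; not Friston's actual formulation).

mu_prior, sigma_p = 3.0, 1.0      # prior over the hidden state x
sigma_y = 1.0                     # observation noise
g = lambda x: x ** 2              # observation mapping y = g(x) + noise
g_prime = lambda x: 2 * x

y = 10.0                          # the observation to be "explained"
mu = mu_prior                     # point estimate of x (Laplace approximation)

def free_energy(mu):
    # Variational free energy up to an additive constant:
    # sensory prediction error plus deviation from the prior.
    return ((y - g(mu)) ** 2) / (2 * sigma_y ** 2) + \
           ((mu - mu_prior) ** 2) / (2 * sigma_p ** 2)

lr = 0.01
for _ in range(500):              # perception = gradient descent on F
    eps_y = (y - g(mu)) / sigma_y ** 2        # sensory prediction error
    eps_p = (mu - mu_prior) / sigma_p ** 2    # prior prediction error
    mu -= lr * (-g_prime(mu) * eps_y + eps_p)

print("posterior estimate of x: %.3f, F = %.3f" % (mu, free_energy(mu)))

The point estimate settles near the value that balances the sensory
prediction error against the prior, which is the sense in which
"perception" falls out of minimizing F.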
best ---Pedro
Loet Leydesdorff wrote:
Dear Steven and colleagues,
I did not (yet) study your approach. Is there a paper that can be read
as an introduction?
It seems to me that one can distinguish between formal and substantial
theories of information. Shannon’s mathematical theory is a formal
apparatus: the design and the results do not yet have meaning without
an interpretation in a substantial context. On the other hand, a
theory about, for example, neuro-information is a special theory. One
can in this context use information theory as a statistical tool
(among other tools). Sometimes, one can move beyond description. :-)
The advantage of information theory, from this perspective of special
theories, is that the formal apparatus allows us sometimes to move
between domains heuristically. For example, a model of the brain can
perhaps be used metaphorically for culture or the economy (or vice
versa). The advantages have to be shown in empirical research: which
questions can be addressed and which puzzles can be solved?
Best,
Loet
------------------------------------------------------------------------
Loet Leydesdorff
/Emeritus/ University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
l...@leydesdorff.net <mailto:l...@leydesdorff.net>;
http://www.leydesdorff.net/
Honorary Professor, SPRU, <http://www.sussex.ac.uk/spru/>University of
Sussex;
Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
Hangzhou; Visiting Professor, ISTIC,
<http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
London;
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
<http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en>
*From:* stevenzen...@gmail.com [mailto:stevenzen...@gmail.com] *On
Behalf Of *Steven Ericsson-Zenith
*Sent:* Tuesday, December 09, 2014 10:13 PM
*To:* l...@leydesdorff.net
*Cc:* Joseph Brenner; fis
*Subject:* Re: [Fis] Information-as-Process
The problem with this approach (and approaches like it) is that it is
descriptive and not explanatory. The distribution of the shape, in my
model, can perhaps be described, but the process, i.e. the action
decision point and its response covariance, is impossible to consider
this way.
It is for this reason that I use holomorphic functors and
hyper-functors in which I can express the explicit role of a base
universal (per gravitation).
Nor is it clear to me that this is what Joe referred to as
"information as process."
On Mon, Dec 8, 2014 at 10:20 PM, Loet Leydesdorff
<l...@leydesdorff.net <mailto:l...@leydesdorff.net>> wrote:
Dear colleagues,
Shannon's information theory can be considered as a calculus
because it allows for a dynamic extension. Theil
(1972), in Statistical Decomposition Analysis (North-Holland),
distinguished between static and dynamic information
measures. In addition to Shannon's static H, one can write:
I = SUM_i q_i * log2(q_i / p_i)

in which q_i can be considered as the a posteriori and p_i as the
a priori distribution. This dynamic information measure can be
decomposed and aggregated. One can also develop measures for
systemic developments and critical transitions. In other words,
information as a process can also be measured in bits of
information. Of course, one can extend the dimensionality (/i/)
for the multivariate case (/ijk/…), and thus use information
theory for network analysis (including time).
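As a small illustration of how this dynamic measure is computed (the
four-category distributions below are made-up numbers, used only to
show the calculation in bits):

import math

def dynamic_information(posterior, prior):
    # Theil's expected information of the message that transforms the
    # a priori distribution into the a posteriori one:
    # I = sum_i q_i * log2(q_i / p_i), measured in bits.
    return sum(q * math.log2(q / p)
               for q, p in zip(posterior, prior) if q > 0)

prior     = [0.25, 0.25, 0.25, 0.25]   # a priori (illustrative)
posterior = [0.40, 0.30, 0.20, 0.10]   # a posteriori (illustrative)

print("I = %.4f bits" % dynamic_information(posterior, prior))

# Static Shannon entropies of the two distributions, for comparison
H_prior = -sum(p * math.log2(p) for p in prior)
H_post  = -sum(q * math.log2(q) for q in posterior if q > 0)
print("H(prior) = %.4f bits, H(posterior) = %.4f bits" % (H_prior, H_post))

Because I is always non-negative and additively decomposable over
groups of categories, it can be aggregated or decomposed in the way
Theil describes, which is what makes it usable as a process measure.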
Best,
Loet
References:
· Leydesdorff, L. (1991). The Static and Dynamic Analysis
of Network Data Using Information Theory. /Social Networks/,
13(4), 301-345.
· Theil, H. (1972). /Statistical Decomposition Analysis/.
Amsterdam/ London: North-Holland.
*From:* Fis [mailto:fis-boun...@listas.unizar.es
<mailto:fis-boun...@listas.unizar.es>] *On Behalf Of *Steven
Ericsson-Zenith
*Sent:* Monday, December 08, 2014 10:22 PM
*To:* Joseph Brenner
*Cc:* fis
*Subject:* Re: [Fis] Information-as-Process
I am a little mystified by your assertion of "information as
process." What, exactly, is this, and how does it differ from
information in general (Shannon)? Is it related to Whitehead's
process notions?
In terms of neuroscience, I believe it is important to move away from
connectionism and modern computational ideas. It is not
clear to me how information theory can be applied to the operation
of the brain at the synaptic level, because the actions and decisions
are made across the structure and not at a single location.
Recognition, for example, is not a point event; rather, it occurs
when a particular shape is formed in the structure (of the CNS,
for example) and is immediately covariant with the "appropriate"
response (another shape), which may be characterized as a
hyper-functor (which may or may not include neurons and astrocytes
in the brain).
Regards,
Steven
--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-------------------------------------------------
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis