Reminders of old news.

In defense of Stan: The use of the term "variety" as a generic stand-in for
Shannon's concept of signal entropy traces to W. Ross Ashby, in his
excellent effort to demystify information theory and cybernetics for the
nontechnical reader. It is appropriate, then, to treat the term "variety"
as agnostic about the particular reference distribution being assumed.
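
For concreteness, here is a minimal sketch of my own (not Ashby's text):
variety counts distinguishable states irrespective of their probabilities,
and coincides with Shannon entropy only in the special case of a uniform
reference distribution, which is why the term can remain agnostic about
that distribution.

# Illustrative sketch: Ashby-style "variety" versus Shannon entropy.
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

states = ["up", "down", "left", "right"]
print(log2(len(states)))               # 2.0 bits of variety (count of states)
print(entropy([0.25] * 4))             # 2.0 bits: uniform case, entropy = variety
print(entropy([0.7, 0.1, 0.1, 0.1]))   # ~1.36 bits: skewed case, entropy < variety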

About bringing "meaning" into the discussion: As Bob Ulanowicz emphasized
in his paper "Shannon Exonerata" from a couple of years ago, Shannon's
analysis implicitly includes two complementary ways of understanding
information: the entropy of a signal channel, and the difference or
reduction of entropy in a received message-bearing signal (that which is,
in effect, "missing" from the received message signal). These two have
opposite signs. This complementarity also indicates the intrinsically
relational nature of the concept of information. What sign (+/-) to assign
to information became a controversial issue between Shannon and Wiener,
especially since Wiener wanted to equate information with negentropy.
Recognizing this complementarity and relationality resolves the issue.
Although what Bob calls the "apophatic" aspect of information can be seen
to be linked to reference and "meaning," these statistical and semiotic
properties should not be confused. As Loet suggests, we would be wise not
to slip into equating statistical signal features with meaning. Reference,
meaning, significance, etc. are not intrinsic to a communication medium;
they are defined relative to an interpretive process, the details of which
are for the most part entirely bracketed from the analysis. For these
reasons, although these interpretation-dependent properties depend upon
the statistical properties of the medium, they cannot be reduced to them
without loss.
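
To make the complementarity concrete, a minimal numerical sketch of my own
(not drawn from Shannon's or Bob's papers), using a hypothetical binary
source sent over a noisy symmetric channel: if one reads the "missing" part
as the conditional entropy H(X|Y), then the source entropy H(X) and the
equivocation H(X|Y) differ by exactly the reduction of entropy carried by
the received message, the mutual information I(X;Y).

# Minimal sketch (illustrative only): Shannon's two complementary quantities
# for a hypothetical binary source over a symmetric channel (flip prob. 0.1).
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

p_flip = 0.1
px = [0.5, 0.5]                                   # source distribution P(X)
joint = {(x, y): px[x] * (p_flip if x != y else 1 - p_flip)
         for x in (0, 1) for y in (0, 1)}         # joint distribution P(X, Y)
py = [sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)]  # received P(Y)

H_X = entropy(px)                                      # entropy of the source
H_X_given_Y = entropy(joint.values()) - entropy(py)    # H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y                               # reduction of entropy

print(H_X, H_X_given_Y, I_XY)   # 1.0  ~0.469  ~0.531 (bits)

The two quantities enter with opposite signs, and neither says anything
about what the message means; that is exactly where the interpretive
process is bracketed.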

— Terry

On Sat, Sep 12, 2015 at 7:48 AM, Stanley N Salthe <ssal...@binghamton.edu>
wrote:

> Reacting to my:
>
> S: Well, I have generalized the Shannon concept of information carrying
> capacity under 'variety'...  {variety {information carrying capacity}}.
> This allows the concept to operate quite generally in evolutionary and
> ecological discourses.  Information, then, if you like, is what is left
> after a reduction in variety, or after some system choice. Consider dance:
> we have all the possible conformations of the human body, out of which a
> few are selected to provide information about the meaning of a dance.
>
> Jerry responded:
>
> Stan's post is a superb example of how anyone can change the semantic meaning
> of words and talk about personal philosophy in a context that ignores the
> syntactical meaning of the same word such that the exact sciences
> are generated.  Of course, this personal philosophy remains a private
> conversation.
>
> S: I really need a translation of this statement.
>
> STAN
>
> On Fri, Sep 11, 2015 at 11:31 AM, Jerry LR Chandler <
> jerry_lr_chand...@me.com> wrote:
>
>> Dear Steven, Pedro and List:
>>
>> Two excellent posts!
>>
>> Steven:  I look forward to your ratiocinations and their connectivity
>> with symbolic logic.
>>
>> It is my view that one of the foundational stumbling blocks to
>> communication about syntactical information theory (and its exactness!) is
>> the multiplicity of meanings that emerges from the multiple symbol systems
>> used by the natural sciences.
>>
>> Stan's post is a superb example of how anyone can change the semantic meaning
>> of words and talk about personal philosophy in a context that ignores the
>> syntactical meaning of the same word such that the exact sciences
>> are generated.  Of course, this personal philosophy remains a private
>> conversation.
>>
>>  Steven and Pedro (and I), by way of contrast, are seeking a discussion
>> of public information and the exactness of public information theory.
>>
>> Cheers
>>
>> Jerry
>>
>>
>> Words to live by:
>>
>> *"The union of units unifies the unity of the universe"*
>>
>>
>>
>>
>> On Sep 11, 2015, at 7:22 AM, Pedro C. Marijuan wrote:
>>
>> Dear Steven and FIS colleagues,
>>
>> Many thanks for this opening text. What you are proposing about a
>> well-structured discussion looks like a good idea, although it will have to
>> confront the usually anarchic discussion style of the FIS list! Two aspects
>> of your initial text have caught my attention (apart from the videos you
>> recommend, which I will watch over the weekend).
>>
>> First, about the concerns of a generation earlier (Shannon, Turing...)
>> situating information at the intersection between physical science and
>> engineering. The towering influence of this line of thought, with both
>> positive and negative overtones, cannot be overestimated. Most attempts
>> to enlarge informational thought and to extend it to life, economies,
>> societies, etc. continue to be but a reformulation of the former ideas
>> with little added value. See one of the latest creations: "Why Information
>> Grows: The Evolution of Order, from Atoms to Economies" (2015), by Cesar
>> Hidalgo (prof. at MIT).
>>
>> In my opinion, the extension of those classic ideas to life is very
>> fertile from the technological point of view, from the "theory of
>> molecular machines" for DNA-RNA-protein matching to genomic, proteomic,
>> and other omics "big data". But all that technobrilliance does not by
>> itself open new avenues for innovative thought about the informational
>> stuff of human societies. Alternatively, we may think that the
>> accelerated digitalization of our world and the cyborg-symbiosis of
>> human information and computer information do not demand much brain
>> teasing, as it is a matter that social evolution is resolving by itself.
>>
>> The point I have occasionally raised in this list is whether all the new
>> molecular knowledge about life might teach us about a fundamental
>> difference in the "way of being in the world" between life and inert
>> matter (& mechanism & computation), or not. In the recent compilation
>> by Plamen and colleagues from the former INBIOSA initiative, I have
>> argued for that fundamental difference in the intertwining of
>> communication and self-production, and for how signaling is strictly
>> caught up in the advancement of a life cycle (see the paper "How the
>> living is in the world"). Life is based on an unusual informational
>> formula, unknown in inert matter. And the very organization of life
>> provides an original starting point to think anew about information,
>> though of course not the only one.
>>
>> So, to conclude this "tangent": I find quite exciting the discussion we
>> are starting now, say from the classical info positions onwards,
>> particularly as it may be compared at some future point with another
>> session (in preparation) with similar ambition but starting from, say,
>> the phenomenology of the living. Striving for a convergence or
>> complementarity of the outcomes would be a gallant effort.
>>
>> All the best--Pedro
>>
>>
>>
>> Steven Ericsson-Zenith wrote:
>>
>> ...The subject is one that has concerned me ever since I completed my PhD
>> in 1992. I came away from defending my thesis, essentially on large-scale
>> parallel computation, with the strong intuition that I had disclosed much
>> more about how little we know than I had offered in the way of a
>> theoretical or engineering solution.
>>
>> For the curious, a digital copy of this thesis can be found among the
>> reports of CRI, MINES ParisTech (formerly ENSMP), at
>> http://www.cri.ensmp.fr/classement/doc/A-232.pdf; it is also available
>> as a paper copy on Amazon.
>>
>>
>> Like many who have been involved in microprocessor and instruction
>> set/language design using mathematical methods, we share the physical
>> concerns of a generation earlier, people like John von Neumann, Alan
>> Turing, and Claude Shannon. In other words, a close intersection between
>> physical science and machine engineering.
>>
>>
>> ...I will then discuss some historical issues in particular referencing
>> Benjamin Peirce, Albert Einstein and Alan Turing. And finally discuss the
>> contemporary issues, as I see them, in biophysics, biology, and associated
>> disciplines, reaching into human and other social constructions, perhaps
>> touching on cosmology and the extended role of information theory in
>> mathematical physics...
>>
>>
>>
>> --
>> -------------------------------------------------
>> Pedro C. Marijuán
>> Grupo de Bioinformación / Bioinformation Group
>> Instituto Aragonés de Ciencias de la Salud
>> Centro de Investigación Biomédica de Aragón (CIBA)
>> Avda. San Juan Bosco, 13, planta X
>> 50009 Zaragoza, Spain
>> Tfno. +34 976 71 3526 (& 6818)
>> pcmarijuan.i...@aragon.es
>> http://sites.google.com/site/pedrocmarijuan/
>> -------------------------------------------------
>>
>>


-- 
Professor Terrence W. Deacon
University of California, Berkeley
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
