Re: [Fis] Information and Locality Introduction

2015-09-16 Thread Moisés André Nisenbaum
Steven
Just for the record: I watched all the Khan Academy videos.
They are awesome!
I will use them with my students.
Thank you.

Best

-- 
Moisés André Nisenbaum
Doctoral candidate, IBICT/UFRJ. Professor. MSc.
Instituto Federal do Rio de Janeiro - IFRJ
Campus Maracanã
moises.nisenb...@ifrj.edu.br


2015-09-09 19:13 GMT-03:00 Steven Ericsson-Zenith :

> Dear List,
>
> This is the start of the next FIS discussion. And this is the first of
> several emails kicking the discussion off and divided into logical parts so
> as not to confront the reader with too many ideas and too much text at once.
>
> The subject is one that has concerned me ever since I completed my PhD in
> 1992. I came away from defending my thesis, essentially on large scale
> parallel computation, with the strong intuition that I had disclosed much
> more concerning the little that we know than I had offered either a
> theoretical or engineering solution.
>
> For the curious, a digital copy of this thesis can be found among the
> reports of CRI, MINES ParisTech, formerly ENSMP,
> http://www.cri.ensmp.fr/classement/doc/A-232.pdf; it is also available as
> a paper copy on Amazon.
>
> Like many who have been involved in microprocessor and instruction
> set/language design using mathematical methods, we share the physical
> concerns of a generation earlier, people like John von Neumann, Alan
> Turing, and Claude Shannon. In other words, a close intersection between
> physical science and machine engineering.
>
> So I wish to proceed as follows, especially since this is a cross
> disciplinary group:
>
> First, I will identify a statement of the domain: what it is that I, in
> particular, speak of when we use the term “Information.” I will clarify as
> necessary. I will then discuss the issue of locality, what I think that
> issue is and why it is a problem. Here we will get into several topics of
> classical discussion. I will briefly present my own mathematics for the
> problem in an informal yet rigorous style, reaching into the foundations of
> logic.
>
> I will then discuss some historical issues, in particular referencing
> Benjamin Peirce, Albert Einstein and Alan Turing. And finally discuss the
> contemporary issues, as I see them, in biophysics, biology, and associated
> disciplines, reaching into human and other social constructions, perhaps
> touching on cosmology and the extended role of information theory in
> mathematical physics.
>
> This will seem very broad but in all cases I will focus upon the issues of
> locality they each present.
>
> Before my preparations for these discussions I surveyed existing
> pedagogical work to see how our science is currently presented and I came
> across the Khan Academy video series on Information Theory, authored by
> Brit Cruise.
>
> As flawed as I find this work, it is nonetheless an adequate place for
> us to start and to build upon. It does a good job in briefly presenting the
> work of Claude Shannon and others, in its second part on Modern Information
> Theory.
>
> I especially encourage advanced readers to take the few minutes needed to
> review the Origin of Markov Chains, A Mathematical Theory of
> Communication, Information Entropy, Compression Codes and Error Correction
> to set the field and ensure that we are on the same page. You may also find
> the final video on SETI work interesting; it will be relevant as we proceed.
>
> You can review these short videos on YouTube and here:
>
> https://www.khanacademy.org/computing/computer-science/informationtheory
>
> or here:
>
> https://www.youtube.com/playlist?list=PLbg3ZX2pWlgKDVFNwn9B63UhYJVIerzHL
>
> I invite you to review these videos as the context for my next posting
> that will be a discussion of what is good about this model, locality, and
> what is, I now argue, fundamentally missing or wrongheaded.
>
> Pedro, at the end of this I will aggregate these parts for the FIS wiki.
>
> Regards,
> Steven
>
> --
> Dr. Steven Ericsson-Zenith, Los Gatos, California. +1-650-308-8611
> http://iase.info
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Information and Locality Introduction

2015-09-12 Thread Terrence W. DEACON
Reminders of old news.

In defense of Stan: The use of the term "variety" as a generic stand-in for
Shannon's concept of signal entropy traces to W. Ross Ashby, in his
excellent effort to demystify information theory and cybernetics for the
nontechnical reader. It is appropriate, then, to assume that use of the
term "variety" is agnostic about the form of a particular reference
distribution being assumed.

About bringing "meaning" into the discussion: As Bob Ulanowicz emphasized
in his paper "Shannon exonerata" from a couple of years ago, Shannon's
analysis implicitly includes two complementary ways of understanding
information: The entropy of a signal channel and the difference or
reduction of entropy of a received message-bearing signal (that which is in
effect "missing" in a received message signal). And these have opposite
signs. This complementarity also indicates the intrinsically relational
nature of the concept of information. What sign (+/-) to assign information
became a controversial issue between Shannon and Wiener, especially since
Wiener wanted to equate information with negentropy. Recognizing this
complementarity and relationality resolves this. Although what Bob calls
the "apophatic" aspect of information can be seen to be linked to reference
and "meaning," these statistical and semiotic properties should not be
confused. As Loet suggests, we would be wise not to slip into a tendency to
equate statistical signal features with meaning. Reference, meaning,
significance, etc. are not intrinsic to a communication medium, but are
defined relative to an interpretive process, the details of which are for
the most part entirely bracketed from the analysis. For these reasons,
although these interpretation-dependent properties are dependent upon
statistical properties of the medium, they cannot be reduced to them
without loss.
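As a minimal numerical sketch of this complementarity (my own illustration, with invented numbers, not anything from Bob's paper): the entropy of the channel measures prior uncertainty, and the information conveyed is the reduction of that uncertainty, so the same quantity appears with opposite sign depending on how it is booked.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum p*log2(p)."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Prior uncertainty at the receiver: four equiprobable alternatives.
prior = [0.25, 0.25, 0.25, 0.25]
# A received message-bearing signal narrows this to two alternatives.
posterior = [0.5, 0.5, 0.0, 0.0]

h_before = entropy(prior)      # 2.0 bits of uncertainty before
h_after = entropy(posterior)   # 1.0 bit of uncertainty after
info = h_before - h_after      # 1.0 bit: the *reduction* of entropy

# The same 1 bit can be booked as uncertainty removed from the
# receiver or as entropy carried by the signal channel -- the
# opposite-sign complementarity noted above.
print(h_before, h_after, info)  # 2.0 1.0 1.0
```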

— Terry

On Sat, Sep 12, 2015 at 7:48 AM, Stanley N Salthe wrote:

> Reacting to my:
>
> S: Well, I have generalized the Shannon concept of information carrying
> capacity under 'variety'...  {variety {information carrying capacity}}.
> This allows the concept to operate quite generally in evolutionary and
> ecological discourses.  Information, then, if you like, is what is left
> after a reduction in variety, or after some system choice. Consider dance:
> we have all the possible conformations of the human body, out of which a
> few are selected to provide information about the meaning of a dance.
>
> Jerry responded:
>
> Stan's post is a superb example of how anyone can change the semantic
> meaning of words and talk about personal philosophy in a context that ignores the
> syntactical meaning of the same word such that the exact sciences
> are generated.  Of course, this personal philosophy remains a private
> conversation.
>
> S: I really need a translation of this statement.
>
> STAN
>
> On Fri, Sep 11, 2015 at 11:31 AM, Jerry LR Chandler <
> jerry_lr_chand...@me.com> wrote:
>
>> Dear Steven, Pedro and List:
>>
>> Two excellent posts!
>>
>> Steven:  I look forward to your ratiocinations and their connectivity
>> with symbolic logic.
>>
>> It is my view that one of the foundational stumbling blocks to
>> communication about syntactical information theory (and its exactness!) is
>> the multi-meanings that emerge from the multiple symbol systems used by the
>> natural sciences.
>>
>> Stan's post is a superb example of how anyone can change the semantic
>> meaning of words and talk about personal philosophy in a context that ignores the
>> syntactical meaning of the same word such that the exact sciences
>> are generated.  Of course, this personal philosophy remains a private
>> conversation.
>>
>>  Steven and Pedro (and I), by way of contrast, are seeking a discussion
>> of public information and the exactness of public information theory.
>>
>> Cheers
>>
>> Jerry
>>
>>
>> Words to live by:
>>
>> *"The union of units unifies the unity of the universe"*
>>
>>
>>
>>
>> On Sep 11, 2015, at 7:22 AM, Pedro C. Marijuan wrote:
>>
>> ...

Re: [Fis] Information and Locality Introduction

2015-09-11 Thread Robert E. Ulanowicz
I'll have to weigh in with Stan on this one. Stan earlier had defined
information more generally as "constraint". It is convenient to employ the
IT calculus to separate constraint from indeterminacy. This is possible in
complete abstraction from anything to do with communication.
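A toy calculation can show that separation (my sketch, using a made-up two-way contingency table rather than anything from Bob's papers): the total indeterminacy H of a joint distribution splits into average mutual information (the constraint) plus a residual conditional entropy, with no mention of senders or receivers.

```python
from math import log2

def joint_entropy(p):
    """Total indeterminacy H of a two-way joint distribution."""
    return -sum(q * log2(q) for row in p for q in row if q > 0)

def mutual_information(p):
    """Average mutual information: the 'constraint' component."""
    pi = [sum(row) for row in p]            # row marginals
    pj = [sum(col) for col in zip(*p)]      # column marginals
    return sum(q * log2(q / (pi[i] * pj[j]))
               for i, row in enumerate(p)
               for j, q in enumerate(row) if q > 0)

# Invented joint distribution (e.g. rows = sources, columns = sinks).
p = [[0.4, 0.1],
     [0.1, 0.4]]

H = joint_entropy(p)        # total indeterminacy
I = mutual_information(p)   # constraint
D = H - I                   # residual indeterminacy, H(X|Y) + H(Y|X)
# H = I + D exactly: constraint separates from indeterminacy
# in complete abstraction from any communication setting.
```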

The ability to make this separation has wide-ranging consequences. For
example, it provides a pathway by which process philosophy can be brought
to bear on quantitative physical systems! It is no longer necessary to
rely solely on positivist "objects moving according to law". That's no
small advance!
<https://www.ctr4process.org/whitehead2015/wp-content/uploads/2014/06/PhilPrax.pdf>
The best,
Bob

> Pedro wrote:
>
> >Most attempts to enlarge informational thought and to extend it to life,
> > economies, societies, etc. continue to be but a reformulation of the
> > former ideas with little added value.
>
> S: Well, I have generalized the Shannon concept of information carrying
> capacity under 'variety'...  {variety {information carrying capacity}}.
> ...
>
> STAN
>
> On Fri, Sep 11, 2015 at 8:22 AM, Pedro C. Marijuan <
> pcmarijuan.i...@aragon.es> wrote:
>
> > ...

Re: [Fis] Information and Locality Introduction

2015-09-11 Thread Stanley N Salthe
Pedro wrote:

>Most attempts to enlarge informational thought and to extend it to life,
economies, societies, etc. continue to be but a reformulation of the former
ideas with little added value.

S: Well, I have generalized the Shannon concept of information carrying
capacity under 'variety'...  {variety {information carrying capacity}}.
This allows the concept to operate quite generally in evolutionary and
ecological discourses.  Information, then, if you like, is what is left
after a reduction in variety, or after some system choice.  Consider dance:
we have all the possible conformations of the human body, out of which a
few are selected to provide information about the meaning of a dance.
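On the dance example, a minimal Hartley-style count makes the "reduction in variety" reading concrete (my illustration; the numbers are invented purely for the sketch):

```python
from math import log2

# Invented figures: suppose a movement notation distinguishes 1024
# possible conformations of the body, and a particular dance selects
# 16 of them as meaningful.
variety_before = 1024   # all possible conformations
variety_after = 16      # conformations the dance actually selects

# Information as reduction of variety: the Hartley measure, which is
# Shannon's formula in the special case of equiprobable alternatives.
info_bits = log2(variety_before) - log2(variety_after)
print(info_bits)  # 6.0
```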

STAN

On Fri, Sep 11, 2015 at 8:22 AM, Pedro C. Marijuan <
pcmarijuan.i...@aragon.es> wrote:

> Dear Steven and FIS colleagues,
>
> ...
Re: [Fis] Information and Locality Introduction

2015-09-11 Thread Francesco Rizzo
Dear Steven Ericsson-Zenith, Pedro and all,
Just a few months ago I began writing a book in which, among other things,
I take up once more the theme of the information of harmony, or the
harmony of information. In recent days especially, after reviewing the
various theories of value of the successive economic schools and currents,
I have been revisiting my own theory of value, based on the surpluses
generated by four types of information: thermodynamic or natural
(neg-entropy), genetic (DNA-RNA-proteins), mathematical (bits of entropy),
and semantic (meaning obtainable with an s-code superimposed on an
equiprobable information source). At bottom, nothing is needed other than
a re-unification of knowledge, as Jerry Chandler also maintains. I have
always held that nature and society possess a marvelous harmony, one not
undermined by the disagreements, contrasts, and diversities that
nonetheless exist. The message I sent on 3 August, at the close of the
previous discussion, therefore reiterates the pressing need to com-prehend
the possible whole, or the whole possible, through a fusion (not a
confusion) of horizons.
My thanks to Ericsson-Zenith for the skillful introduction he has offered
us, and to Pedro, inimitable director of these magnificent initiatives.
Francesco Rizzo


2015-09-11 18:58 GMT+02:00 Robert E. Ulanowicz :

> I'll have to weigh in with Stan on this one. Stan earlier had defined
> information more generally as "constraint". It is convenient to employ the
> IT calculus to separate constraint from indeterminacy. This is possible in
> complete abstraction from anything to do with communication.
>
> The ability to make this separation has wide-ranging consequences. For
> example, it provides a pathway by which process philosophy can be brought
> to bear on quantitative physical systems! It is no longer necessary to
> rely solely on positivist "objects moving according to law". That's no
> small advance!
>
> <https://www.ctr4process.org/whitehead2015/wp-content/uploads/2014/06/PhilPrax.pdf>
>
> The best,
> Bob
>
> > Pedro wrote:
> >
> > ...

Re: [Fis] Information and Locality Introduction

2015-09-11 Thread Pedro C. Marijuan

Dear Steven and FIS colleagues,

Many thanks for this opening text. What you are proposing, a fairly
structured discussion, looks like a good idea, although it will have to
confront the usually anarchic discussion style of the FIS list! Two aspects
of your initial text have caught my attention (apart from those videos
you recommend, which I will watch over the weekend).

First about the concerns of a generation earlier (Shannon, Turing...)
situating information in the intersection between physical science and
engineering. The towering influence of this line of thought, both with
positive and negative overtones, cannot be overestimated. Most attempts
to enlarge informational thought and to extend it to life, economies,
societies, etc. continue to be but a reformulation of the former ideas
with little added value. See one of the latest creations: "Why Information
Grows: The Evolution of Order, from Atoms to Economies" (2015), by Cesar
Hidalgo (prof. at MIT).

In my opinion, the extension of those classic ideas to life is very
fertile from the technological point of view, from the "theory of
molecular machines" for DNA-RNA-protein matching to genomic-proteomic
and other omics'  "big data". But all that technobrilliance does not
open per se new avenues in order to produce innovative thought about the
information stuff of human societies. Alternatively we may think that
the accelerated digitalization of our world and the cyborg-symbiosis of
human information and computer information do not demand much brain
teasing, as it is a matter that social evolution is superseding by itself.

The point I have occasionally raised in this list is whether all the new
molecular knowledge about life might teach us about a fundamental
difference in the "way of being in the world" between life and inert
matter (& mechanism & computation)---or not. In the recent compilation
by Plamen and colleagues from the former INBIOSA initiative,  I have
argued about that fundamental difference in the intertwining of
communication/self-production, how signaling is strictly caught in the
advancement of a life cycle  (see paper "How the living is in the
world"). Life is based on an unprecedented informational formula unknown in
inert matter. And the very organization of life provides an original
starting point to think anew about information --of course, not the only
one.

So, to conclude this "tangent", I find quite exciting the discussion we
are starting now, say from the classical info positions onwards, in
particular to be compared at some future point with another session (in
preparation) with similar ambition but starting from, say, the
phenomenology of the living. Striving for a
convergence/complementarity of outcomes would be a noble effort.

All the best--Pedro



Steven Ericsson-Zenith wrote:
...





--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-




Re: [Fis] Information and Locality Introduction

2015-09-11 Thread HowlBloom

In a message dated 9/11/2015 8:15:48 A.M. Eastern Daylight Time,
pcmarijuan.i...@aragon.es writes:

Dear Steven and FIS colleagues,

Many thanks for this opening  text. What you are proposing about a pretty
structured discussion looks a  good idea, although it will have to
confront the usually anarchic  discussion style of FIS list! Two aspects
of your initial text have caught  my attention (apart from those videos
you recommend that I will watch along  the weekend).

First about the concerns of a generation earlier  (Shannon, Turing...)
situating information in the intersection between  physical science and
engineering. The towering influence of this line of  thought, both with
positive and negative overtones, cannot be  overestimated. Most attempts
to enlarge informational thought and to extend  it to life, economies,
societies, etc. continue to be but a reformulation  of the former ideas
with little added value. See one of the last creatures:  "Why Information
Grows: The Evolution of Order, from Atoms to Economies"  (2015), by Cesar
Hidalgo (prof. at MIT).

In my opinion, the  extension of those classic ideas to life are very
fertile from the  technological point of view, from the "theory of
molecular machines" for  DNA-RNA-protein matching to genomic-proteomic
and other omics'  "big  data". But all that technobrilliance does not
open per se new avenues in  order to produce innovative thought about the
information stuff of human  societies. Alternatively we may think that
the accelerated digitalization  of our world and the cyborg-symbiosis of
human information and computer  information do not demand much brain
teasing, as it is a matter that social  evolution is superseding by itself.

The point I have ocasionally raised  in this list is whether all the new
molecular knowledge about life might  teach us about a fundamental
difference in the "way of being in the world"  between life and inert
matter (& mechanism & computation)---or not.  In the recent compilation
by Plamen and colleagues from the former INBIOSA  initiative,  I have
argued about that fundamental difference in the  intertwining of
communication/self-production, how signaling is strictly  caught in the
advancement of a life cycle  (see paper "How the living  is in the
world"). Life is based on an inusitate informational formula  unknown in
inert matter. And the very organization of life provides an  original
starting point to think anew about information --of course, not  the only
one.

So, to conclude this "tangent", I find quite exciting the discussion we
are starting now, say from the classical info positions onwards, in
particular to be compared at some future point with another session (in
preparation) with similar ambition but starting from, say, the
phenomenology of the living. Struggling for a
convergence/complementarity of outcomes would be a worthy effort.

All the best--Pedro



Steven Ericsson-Zenith  wrote:
> ...The subject is one that has concerned me ever since I completed my
PhD in 1992. I came away from defending my thesis, essentially on large scale
parallel computation, with the strong intuition that I had disclosed much
more concerning the little that we know, than I had offered either a
theoretical or engineering solution.
>
> For the curious, a digital copy of this thesis can be found among the
reports of CRI, MINES ParisTech, formerly ENSMP,
http://www.cri.ensmp.fr/classement/doc/A-232.pdf, it is also available as a
paper copy on Amazon.
>
> Like many that have been involved in microprocessor and instruction
set/language design, using mathematical methods, we share the physical
concerns of a generation earlier, people like John Von Neumann, Alan Turing,
and Claude Shannon. In other words, a close intersection between physical
science and machine engineering.
>
> ...I will then discuss some historical issues in particular referencing
Benjamin Peirce, Albert Einstein and Alan Turing. And finally discuss the
contemporary issues, as I see them, in biophysics, biology, and associated
disciplines, reaching into human and other social constructions, perhaps
touching on cosmology and the extended role of information theory in
mathematical physics...
>


-- 
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


 
 

Re: [Fis] Information and Locality Introduction

2015-09-11 Thread Jerry LR Chandler
Dear Steven, Pedro and List:

Two excellent posts!

Steven: I look forward to your ratiocinations and their connectivity with
symbolic logic.

It is my view that one of the foundational stumbling blocks to communication 
about syntactical information theory (and its exactness!) is the multi-meanings 
that emerge from the multiple symbol systems used by the natural sciences.

Stan's post is a superb example of how anyone can change the semantic meaning of
words and talk about personal philosophy in a context that ignores the
syntactical meaning of the same word from which the exact sciences are
generated. Of course, this personal philosophy remains a private conversation.

Steven and Pedro (and I), by way of contrast, are seeking a discussion of
public information and the exactness of public information theory.

Cheers

Jerry


Words to live by:

"The union of units unifies the unity of the universe"


 

On Sep 11, 2015, at 7:22 AM, Pedro C. Marijuan wrote:

> Dear Steven and FIS colleagues,
> 
> Many thanks for this opening text. What you are proposing about a pretty
> structured discussion looks a good idea, although it will have to
> confront the usually anarchic discussion style of FIS list! Two aspects
> of your initial text have caught my attention (apart from those videos
> you recommend that I will watch along the weekend).
> 
> First about the concerns of a generation earlier (Shannon, Turing...)
> situating information in the intersection between physical science and
> engineering. The towering influence of this line of thought, both with
> positive and negative overtones, cannot be overestimated. Most attempts
> to enlarge informational thought and to extend it to life, economies,
> societies, etc. continue to be but a reformulation of the former ideas
> with little added value. See one of the latest creations: "Why Information
> Grows: The Evolution of Order, from Atoms to Economies" (2015), by Cesar
> Hidalgo (prof. at MIT).
> 
> In my opinion, the extension of those classic ideas to life is very
> fertile from the technological point of view, from the "theory of
> molecular machines" for DNA-RNA-protein matching to genomic-proteomic
> and other omics' "big data". But all that technobrilliance does not
> open per se new avenues in order to produce innovative thought about the
> information stuff of human societies. Alternatively, we may think that
> the accelerated digitalization of our world and the cyborg-symbiosis of
> human information and computer information do not demand much brain
> teasing, as it is a matter that social evolution is resolving by itself.
> 
> The point I have occasionally raised in this list is whether all the new
> molecular knowledge about life might teach us about a fundamental
> difference in the "way of being in the world" between life and inert
> matter (& mechanism & computation)---or not. In the recent compilation
> by Plamen and colleagues from the former INBIOSA initiative, I have
> argued about that fundamental difference in the intertwining of
> communication/self-production: how signaling is strictly caught up in the
> advancement of a life cycle (see the paper "How the living is in the
> world"). Life is based on an unprecedented informational formula, unknown in
> inert matter. And the very organization of life provides an original
> starting point to think anew about information --of course, not the only
> one.
> 
> So, to conclude this "tangent", I find quite exciting the discussion we
> are starting now, say from the classical info positions onwards, in
> particular to be compared at some future point with another session (in
> preparation) with similar ambition but starting from, say, the
> phenomenology of the living. Struggling for a
> convergence/complementarity of outcomes would be a worthy effort.
> 
> All the best--Pedro
> 
> 
> 
> Steven Ericsson-Zenith wrote:
>> ...The subject is one that has concerned me ever since I completed my PhD in 
>> 1992. I came away from defending my thesis, essentially on large scale 
>> parallel computation, with the strong intuition that I had disclosed much 
>> more concerning the little that we know, than I had offered either a 
>> theoretical or engineering solution. 
>> For the curious, a digital copy of this thesis can be found among the 
>> reports of CRI, MINES ParisTech, formerly ENSMP, 
>> http://www.cri.ensmp.fr/classement/doc/A-232.pdf, it is also available as a 
>> paper copy on Amazon.
>> 
>> Like many that have been involved in microprocessor and instruction 
>> set/language design, using mathematical methods, we share the physical 
>> concerns of a generation earlier, people like John Von Neumann, Alan Turing, 
>> and Claude Shannon. In other words, a close intersection between physical 
>> science and machine engineering.
>> 
>> ...I will then discuss some historical issues in particular referencing 
>> Benjamin Peirce, Albert Einstein and Alan Turing. And finally discuss the 
>> contemporary issues, as I 

[Fis] Information and Locality Introduction

2015-09-09 Thread Steven Ericsson-Zenith
Dear List,

This is the start of the next FIS discussion. And this is the first of several 
emails kicking the discussion off and divided into logical parts so as not to 
confront the reader with too many ideas and too much text at once.

The subject is one that has concerned me ever since I completed my PhD in 1992.
I came away from defending my thesis, essentially on large scale parallel
computation, with the strong intuition that I had disclosed much more
concerning the little that we know than I had offered either a theoretical or
engineering solution.

For the curious, a digital copy of this thesis can be found among the reports
of CRI, MINES ParisTech (formerly ENSMP):
http://www.cri.ensmp.fr/classement/doc/A-232.pdf. It is also available as a
paper copy on Amazon.

Like many who have been involved in microprocessor and instruction
set/language design using mathematical methods, we share the physical concerns
of a generation earlier: people like John von Neumann, Alan Turing, and Claude
Shannon. In other words, a close intersection between physical science and
machine engineering.

So I wish to proceed as follows, especially since this is a cross-disciplinary
group:

First, I will identify a statement of the domain: what it is that I, in
particular, mean when we use the term “Information.” I will clarify as
necessary. I will then discuss the issue of locality, what I think that issue
is, and why it is a problem. Here we will get into several topics of classical
discussion. I will briefly present my own mathematics for the problem in an
informal yet rigorous style, reaching into the foundations of logic.

I will then discuss some historical issues, in particular referencing Benjamin
Peirce, Albert Einstein, and Alan Turing. And finally, I will discuss the
contemporary issues, as I see them, in biophysics, biology, and associated
disciplines, reaching into human and other social constructions, perhaps
touching on cosmology and the extended role of information theory in
mathematical physics.

This will seem very broad but in all cases I will focus upon the issues of 
locality they each present.

Before my preparations for these discussions I surveyed existing pedagogical 
work to see how our science is currently presented and I came across the Khan 
Academy video series on Information Theory, authored by Brit Cruise. 

As flawed as I find this work, it is nonetheless an adequate place for us to
start and to build upon. It does a good job of briefly presenting the work of
Claude Shannon and others in its second part, on Modern Information Theory.

I especially encourage advanced readers to take the few minutes needed to
review the Origin of Markov Chains, A Mathematical Theory of Communication,
Information Entropy, Compression Codes, and Error Correction to set the field
and ensure that we are on the same page. You may also find the final video on
the SETI work interesting; it will be relevant as we proceed.
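As a concrete anchor for the Information Entropy topic above, Shannon's measure can be sketched in a few lines of Python. This is my own illustrative code, not taken from the videos: it computes the entropy of a message's empirical symbol distribution, H = sum over symbols of p * log2(1/p), in bits per symbol.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Bits per symbol of the message's empirical symbol distribution:
    H = sum over symbols of p * log2(1/p)."""
    n = len(message)
    # log2(n / c) == log2(1 / p) for a symbol seen c times out of n
    return sum((c / n) * log2(n / c) for c in Counter(message).values())

# A uniform four-symbol source carries exactly 2 bits per symbol,
# while a constant source carries no information at all.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaaa"))  # 0.0
```

The point the videos make, and which this sketch makes tangible, is that entropy bounds how far a message can be compressed: the more predictable the source, the fewer bits per symbol it actually needs.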

You can review these short videos on YouTube and here:

https://www.khanacademy.org/computing/computer-science/informationtheory

or here:

https://www.youtube.com/playlist?list=PLbg3ZX2pWlgKDVFNwn9B63UhYJVIerzHL

I invite you to review these videos as the context for my next posting, which
will be a discussion of what is good about this model, locality, and what is, I
now argue, fundamentally missing or wrongheaded.

Pedro, at the end of this I will aggregate these parts for the FIS wiki.

Regards,
Steven

--
Dr. Steven Ericsson-Zenith, Los Gatos, California. +1-650-308-8611
http://iase.info



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis