This is the start of the next FIS discussion. And this is the first of several
emails kicking off the discussion, divided into logical parts so as not to
confront the reader with too many ideas and too much text at once.
The subject is one that has concerned me ever since I completed my PhD in 1992.
I came away from defending my thesis, essentially on large-scale parallel
computation, with the strong intuition that I had disclosed much more
concerning the little that we know than I had offered either a theoretical or
practical contribution.
For the curious, a digital copy of this thesis can be found among the reports
of CRI, MINES ParisTech, formerly ENSMP,
http://www.cri.ensmp.fr/classement/doc/A-232.pdf; it is also available as a
paper copy on Amazon.
Like many who have been involved in microprocessor and instruction
set/language design using mathematical methods, I share the physical concerns
of a generation earlier, of people like John von Neumann, Alan Turing, and
Claude Shannon. In other words, a close intersection between physical science
and computation.
So I wish to proceed as follows, especially since this is a cross-disciplinary
discussion.
First, I will identify a statement of the domain: what it is that I, in
particular, speak of when using the term “Information.” I will clarify as
necessary. I
will then discuss the issue of locality, what I think that issue is and why it
is a problem. Here we will get into several topics of classical discussion. I
will briefly present my own mathematics for the problem in an informal yet
rigorous style, reaching into the foundations of logic.
I will then discuss some historical issues, in particular referencing Benjamin
Peirce, Albert Einstein, and Alan Turing. Finally, I will discuss the contemporary
issues, as I see them, in biophysics, biology, and associated disciplines,
reaching into human and other social constructions, perhaps touching on
cosmology and the extended role of information theory in mathematical physics.
This will seem very broad but in all cases I will focus upon the issues of
locality they each present.
Before my preparations for these discussions I surveyed existing pedagogical
work to see how our science is currently presented and I came across the Khan
Academy video series on Information Theory, authored by Brit Cruise.
As flawed as I find this work, it is nonetheless an adequate place for us to
start and to build upon. It does a good job in briefly presenting the work of
Claude Shannon and others, in its second part on Modern Information Theory.
I especially encourage advanced readers to spend the few minutes it takes to
review the Origin of Markov Chains, A Mathematical Theory of Communication,
Information Entropy, Compression Codes and Error Correction to set the field
and ensure that we are on the same page. You may also find the final video on
SETI work interesting; it will be relevant as we proceed.
You can review these short videos on YouTube.
I invite you to review these videos as the context for my next posting that
will be a discussion of what is good about this model, locality, and what is, I
now argue, fundamentally missing or wrongheaded.
Pedro, at the end of this I will aggregate these parts for the FIS wiki.
Dr. Steven Ericsson-Zenith, Los Gatos, California. +1-650-308-8611