Dear colleagues,

I am entirely in agreement with the sentiments about mutual respect that
Loet recommends and the "harmony of knowledge" that Francesco promotes. But
I believe that this must also include a willingness to recognize that there
is no single most basic theory, only what we might characterize as the
currently most thoroughly worked-out analysis. That analysis operates at the
most stripped-down level, and it therefore necessarily ignores much that is
essential to a fuller analysis of information.


In this respect Loet comments:


"In my opinion, the status of Shannon’s mathematical theory of information
is different  from special theories of information (e.g., biological ones)
since the formal theory enables us to translate between these latter
theories."


We are essentially in agreement, and yet I would invert any perspective
that prioritizes the approach pioneered by Shannon. This analysis of the
signal properties that are necessary for conveying information does not
attempt to address the "higher order" properties that we pay attention to
in domains where reference and functional value are relevant (e.g. biology,
neuroscience, sociology, art). It necessarily brackets these aspects from
consideration. It thereby provides a common analytic tool that is necessary
but not sufficient. More than half a century of development along these lines has
demonstrated that there are critical features of the information
relationship that cannot be reduced to intrinsic signal properties.


I have argued that there are basically two higher-order general properties
that constitute information: the referential relation and the
normative/functional value relation (with the term 'meaning' often used
somewhat ambiguously to refer to one or both of these properties). I do not
assume that these completely characterize all higher-order properties, and
so I would be open to discussing additional general attributes that fall
outside these domains and also need to be considered.


So I am not a fan of prioritizing the statistical conception of information
and considering all others to be "special" theories.


My hope for the field is that we will continue to work toward formalization
of these higher-order properties with the aim of embedding our current
"signal property analysis" within this larger theory. In this respect, I
would argue that the "mathematical theory" as currently developed is in
fact a "special theory," restricted to analyses where reference and
functional significance can be set aside (as in engineering applications),
and that the "general theory" remains to be formulated.


Since its inception, it has been recognized that the "mathematical theory
of communication" has used the term 'information' in a highly atypical
sense. I think that we would do well to keep this historical "accident" in
mind in order to avoid "information fundamentalism." This demands a sort of
humility in the face of the enormity of the challenge before us, not merely
a tolerance of "special" domains of application that don't completely
reduce to statistical analysis.


My proposal is that agreeing on terminological distinctions that support
such a paradigm inversion might provide a first step toward theoretical
convergence on a "general theory" of information. I would welcome such
a discussion in the new year.


Happy holidays to all, Terry

On Sat, Dec 24, 2016 at 2:22 AM, Francesco Rizzo <
13francesco.ri...@gmail.com> wrote:

> Dear All,
> I have written the same things several times, which is why I agree with you,
> especially with the most recent contributors. And given that I am an outsider
> to your disciplines, but not a stranger to the harmony of knowledge or the
> knowledge of harmony, this is a fine thing. Best wishes for Christmas and for
> the new year.
> Francesco
>
> 2016-12-24 7:45 GMT+01:00 Loet Leydesdorff <l...@leydesdorff.net>:
>
>> Dear Terrence and colleagues,
>>
>>
>>
>> I agree that we should not be fundamentalistic about “information”. For
>> example, one can also use “uncertainty” as an alternative word to
>> Shannon-type “information”. One can also make distinctions other than
>> semantic/syntactic/pragmatic, such as biological information, etc.
>>
>>
>>
>> Nevertheless, what makes this list a common platform, in my opinion,
>> is our interest in the differences and similarities in the background of
>> these different notions of information. In my opinion, the status of
>> Shannon’s mathematical theory of information is different  from special
>> theories of information (e.g., biological ones) since the formal theory
>> enables us to translate between these latter theories. The translations are
>> heuristically important: they enable us to import metaphors from other
>> backgrounds (e.g., auto-catalysis).
>>
>>
>>
>> For example, one of us wrote to me explaining why I was completely wrong,
>> making the argument with reference to the Kullback-Leibler divergence
>> between two probability distributions. Since we probably will not have “a
>> general theory” of information, the apparatus in which information is
>> formally and operationally defined (Bar-Hillel once called it “information
>> calculus”) can carry this interdisciplinary function with precision and
>> rigor. Otherwise, we can only be respectful of each other’s research
>> traditions. :-)
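>>
>> A minimal sketch of what such a translation looks like in practice (in
>> Python; the two discrete distributions are invented purely for
>> illustration): the Kullback-Leibler divergence measures, in bits, the
>> information generated when one distribution is revised into another.
>>
>>   import math
>>
>>   def kl_divergence(q, p):
>>       # expected information: sum_i q_i * log2(q_i / p_i), in bits
>>       return sum(qi * math.log2(qi / pi) for qi, pi in zip(q, p) if qi > 0)
>>
>>   p = [0.5, 0.25, 0.25]   # a priori distribution
>>   q = [0.7, 0.2, 0.1]     # a posteriori distribution
>>   print(kl_divergence(q, p))  # about 0.14 bits
>>
>> The same few lines apply whether p and q describe letter frequencies,
>> citation distributions, or gene expression levels; that is the sense in
>> which such a calculus can carry an interdisciplinary function.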
>>
>>
>>
>> I wish you all a splendid 2017,
>>
>> Loet
>>
>>
>> ------------------------------
>>
>> Loet Leydesdorff
>>
>> Professor, University of Amsterdam
>> Amsterdam School of Communication Research (ASCoR)
>>
>> l...@leydesdorff.net ; http://www.leydesdorff.net/
>> Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/>University of
>> Sussex;
>>
>> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
>> Hangzhou; Visiting Professor, ISTIC,
>> <http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
>>
>> Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
>> London;
>>
>> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>>
>>
>>
>> *From:* Fis [mailto:fis-boun...@listas.unizar.es] *On Behalf Of *Terrence
>> W. DEACON
>> *Sent:* Thursday, December 22, 2016 5:33 AM
>> *To:* fis
>>
>> *Subject:* Re: [Fis] What is information? and What is life?
>>
>>
>>
>> Against information fundamentalism
>>
>>
>>
>> Rather than fighting over THE definition of information, I suggest that
>> we stand back from the polemics for a moment and recognize that the term is
>> being used in often quite incompatible ways in different domains, and that
>> there may be value in paying attention to the advantages and costs of each.
>> If we ignore these differences, fail to explore the links and dependencies
>> between them, and remain indifferent to the different use values gained or
>> sacrificed by each, I believe that we end up undermining the very
>> enterprise we claim to be promoting.
>>
>>
>>
>> We currently lack broadly accepted terms to unambiguously distinguish
>> these divergent uses and, even worse, we lack a theoretical framework for
>> understanding their relationships to one another.
>>
>> So provisionally I would argue that we at least need to distinguish three
>> hierarchically related uses of the concept:
>>
>>
>>
>> 1. Physical information: Information as intrinsically measurable medium
>> properties with respect to their capacity to support 2 or 3 irrespective of
>> any specific instantiation of 2 or 3.
>>
>>
>>
>> 2. Referential information: Information as a non-intrinsic relation to
>> something other than medium properties (1) that a given medium can provide
>> (i.e. reference or content) irrespective of any specific instantiation of 3.
>>
>>
>>
>> 3. Normative information: Information as the use value provided by a
>> given referential relation (2) with respect to an end-directed dynamic that
>> is susceptible to contextual factors that are not directly accessible (i.e.
>> functional value or significance).
>>
>>
>>
>> Unfortunately, because the same unmodified term has historically been used
>> in each relevant domain irrespective of the others, there are often
>> pointless arguments of a purely definitional nature.
>>
>>
>>
>> In linguistic theory an analogous three-part hierarchic partitioning of
>> theory IS widely accepted.
>>
>>
>>
>> 1. syntax
>>
>> 2. semantics
>>
>> 3. pragmatics
>>
>>
>>
>> Thus by analogy some have proposed the distinction between
>>
>>
>>
>> 1. syntactic information (aka Shannon)
>>
>> 2. semantic information (aka meaning)
>>
>> 3. pragmatic information (aka useful information)
>>
>>
>>
>> This has also often been applied to the philosophy of information (e.g.
>> see The Stanford Encyclopedia of Philosophy entry for ‘information’).
>> Unfortunately, the language-centric framing of this distinction can be
>> somewhat misleading. The metaphoric extension of the terms ‘syntax’ and
>> ‘semantics’ to apply to iconic (e.g. pictorial) or indexical (e.g.
>> correlational) forms of communication exerts a subtle procrustean influence
>> that obscures their naturalistic and nondigital features. This language
>> bias is also often introduced with the term ‘meaning’ because of its
>> linguistic connotations (i.e. does a sneeze have a meaning? Not in any
>> standard sense. But it provides information “about” the state of the person who
>> sneezed.)
>>
>>
>>
>> So as a first rough terminological distinction I propose using
>>
>>
>>
>> 1. physical information (or perhaps information1)
>>
>> 2. referential information (information2)
>>
>> 3. normative information (information3)
>>
>>
>>
>> to avoid definitional equivocation and the loss of referential clarity.
>>
>>
>>
>> I would argue that we use the term ‘information’ in a prescinded way in
>> both 1 and 2. That is, considered from the perspective of a potential
>> interpretation (3) we can bracket consideration of any particular
>> interpretation to assess the possible relational properties that are
>> available to provide reference (2); and we can bracket both 3 and 2 to only
>> consider the medium/signal properties minimally available for 2 and 3
>> irrespective of using them for these purposes.*
>>
>>
>>
>> Although 2 and 3 are not quantifiable in the same sense that 1 is,
>> neither are they unconstrained or merely subjective. The possible
>> referential content of a given medium or sign vehicle is constrained by the
>> physical properties of the medium and its relationship to its physical
>> context. Normative information captures the way that referential content
>> can be correct or incorrect, accurate or inaccurate, useful or useless,
>> etc., depending on the requirements of the interpretive system and its
>> relation to the context. In both cases there are specific unambiguously
>> identifiable constraints on reference and normative value.
>>
>>
>>
>> There has been a prejudice in favor of 1 because of the (mistaken) view
>> that 2 and 3 are in some deep sense nonphysical and subjective.
>> Consistent with this view, there have been many efforts to find a way to
>> reduce 2 and 3 to some expression of 1. Although it is often remarked that
>> introducing non-reduced concepts of referential content (2) and normative
>> evaluation (3) into the theory of information risks introducing
>> non-quantifiable (and, by assumption, non-scientific) attributes, I think
>> that this is more a prejudice than a principle that has been rigorously
>> demonstrated. Even if there is currently no widely accepted
>> non-reductionistic formalization of reference and significance within the
>> information sciences, this is not evidence that it cannot be achieved. One
>> thing is clear, however: until we find a way to use the term ‘information’
>> that does not privilege one of these uses over the others and
>> unequivocally distinguishes each and their relationships to one another,
>> the debates we engage in on this forum will remain interminable.
>>
>>
>>
>> So I suggest that we commence a discussion of how best to accomplish this
>> terminological brush-clearing before further debating the relevance of
>> information to physics, logic, biology, or art. I apologize if this is
>> already accepted as “solved” by some readers, and would be glad to receive
>> and share your different taxonomies and learn how they are justified.
>>
>>
>>
>> — Terry
>>
>>
>>
>> * Stan Salthe might organize them in a subsumptive hierarchy.
>>
>>
>>
>> On Tue, Dec 20, 2016 at 4:19 PM, Mark Johnson <johnsonm...@gmail.com>
>> wrote:
>>
>> Dear all,
>>
>>
>>
>> It's important that one should remain practical. Shannon's formulae are
>> practical. The correspondence with certain tenets of cybernetics, such as
>> Ashby's Law or Maturana's "Structural Coupling", presents Shannon as a
>> window for exploring *relations* empirically. This I understand to be Bob
>> Ulanowicz's focus too. I think Ashby's epistemology, which accompanied his
>> championing of Shannon (and which seems to me to be quite radical), is worth
>> a much deeper exploration (it was eclipsed by second-order cybernetics in
>> the early 70s). Klaus Krippendorff wrote an excellent paper about this
>> here: http://repository.upenn.edu/cgi/viewcontent.cgi?article=1245&context=asc_papers
>>
>>
>>
>> Information theory is counting - but it provides a way of measuring
>> relations, which I think marks it out as distinct from other statistical
>> techniques such as variance. It also provides the basis for questioning
>> what we actually mean by counting in the first place: you might call it
>> "critical counting". For example, Ashby makes the comment about "analogy"
>> (a key concept if we are to say that one thing is of the same class as another
>> when we count them together)... (apologies because I can't find the
>> reference to this right now, but will send if anyone is interested):
>>
>>
>>
>> "The principle of analogy is founded upon the assumption that a degree of
>> likeness between two objects in respect of their known qualities is some
>> reason for expecting a degree of likeness between them in respect of their
>> unknown qualities also, and that the probability with which unascertained
>> similarities are to be expected depends upon the amount of likeness already
>> known."
>>
>>
>>
>> Also, just to correct a possible misconception: I don't think counting
>> leads to populism. Econometrics has led to populism. Some of the greatest
>> economists of the 20th century saw the problem - this is why Keynes wrote a
>> book on probability, and Hayek wrote extensively criticising mathematical
>> modelling. In the end, I'm afraid, it's an American problem which goes back
>> to McCarthy and the distrust of criticality in the social sciences in
>> favour of positivist mathematical "objectivist" approaches. Those schools
>> in the US which championed mathematical approaches (Chicago, etc) got all
>> the funding, controlled the journals, whilst others were starved. The
>> legacy from the 1950s is still with us: it's still very hard to get an
>> economics paper published unless it's got crazy equations in it. In the
>> end, it's just bad theory - and bad mathematics.
>>
>>
>>
>> We could well see a similar thing happen with climate science in the next
>> four years.
>>
>>
>>
>> Best wishes,
>>
>>
>>
>> Mark
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> On 20 December 2016 at 19:55, Bob Logan <lo...@physics.utoronto.ca>
>> wrote:
>>
>> Loet - thanks for the mention of our (Kauffman, Logan et al.) definition
>> of information, which is a qualitative description of information. As to
>> whether one can measure information with our description, my response is
>> no, but I am not sure that one can measure information at all. What units
>> would one use to measure information? *E* = mc^2 contains a lot of
>> information, but the amount of information depends on context. A McLuhan
>> one-liner such as 'the medium is the message' also contains a lot of
>> information even though it is only 5 words or 26 characters long.
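>>
>> (For contrast, one can of course compute a Shannon-style figure for that
>> one-liner; a rough sketch in Python, using the string's own character
>> frequencies, is below. But the number it prints measures only the symbol
>> statistics of the signal, not what the sentence means.)
>>
>>   import math
>>   from collections import Counter
>>
>>   s = "the medium is the message"
>>   n = len(s)
>>   counts = Counter(s)
>>   # bits per character under the string's own character frequencies
>>   h = -sum((c / n) * math.log2(c / n) for c in counts.values())
>>   print(h, h * n)  # bits per character, and total bits for the whole string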
>>
>>
>>
>> Hopefully I have provided some information but how much information is
>> impossible to measure.
>>
>>
>>
>> Bob
>>
>>
>>
>>
>>
>>
>>
>> ______________________
>>
>>
>>
>> Robert K. Logan
>>
>> Prof. Emeritus - Physics - U. of Toronto
>>
>> Fellow University of St. Michael's College
>>
>> Chief Scientist - sLab at OCAD
>>
>> http://utoronto.academia.edu/RobertKLogan
>>
>> www.researchgate.net/profile/Robert_Logan5/publications
>>
>> https://www.physics.utoronto.ca/people/homepages/logan/
>>
>>
>>
>>
>> On Dec 20, 2016, at 3:26 AM, Loet Leydesdorff <l...@leydesdorff.net>
>> wrote:
>>
>>
>>
>> Dear colleagues,
>>
>>
>>
>> A distribution contains uncertainty that can be measured in terms of bits
>> of information.
>>
>> Alternatively: the expected information content *H* of a probability
>> distribution is H = -Σ_i p_i log2(p_i).
>>
>> *H* is further defined as probabilistic entropy using Gibbs’ formulation
>> of the entropy, S = -k_B Σ_i p_i ln(p_i).
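>>
>> A minimal sketch of how *H* is computed in practice (in Python; the
>> distribution is only an example):
>>
>>   import math
>>
>>   def H(p):
>>       # expected information content in bits: -sum_i p_i * log2(p_i)
>>       return -sum(pi * math.log2(pi) for pi in p if pi > 0)
>>
>>   print(H([0.5, 0.25, 0.25]))  # 1.5 bits of uncertainty in this distribution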
>>
>>
>>
>> This definition of information is an operational definition. In my
>> opinion, we do not need an essentialistic definition that answers the
>> question “what is information?” As the discussion on this list
>> demonstrates, one does not easily agree on an essential answer; one can,
>> however, answer the question “how is information defined?” Information is
>> not “something out there” which “exists” otherwise than as our construct.
>>
>>
>>
>> Using essentialistic definitions, the discussion tends not to move
>> forward. Consider, for example, Stuart Kauffman’s and Bob Logan’s (2007)
>> definition of information “as natural selection assembling the very
>> constraints on the release of energy that then constitutes work and the
>> propagation of organization.” I asked several times what this means and how
>> one can measure this information. Hitherto, I only obtained the answer that
>> colleagues who disagree with me will be cited. :-) Another answer was that
>> “counting” may lead to populism. :-)
>>
>>
>>
>> Best,
>>
>> Loet
>>
>>
>> ------------------------------
>>
>> Loet Leydesdorff
>>
>> Professor, University of Amsterdam
>> Amsterdam School of Communication Research (ASCoR)
>>
>> l...@leydesdorff.net  <l...@leydesdorff.net>; http://www.leydesdorff.net/
>>
>> Associate Faculty, SPRU,  <http://www.sussex.ac.uk/spru/>University of
>> Sussex;
>>
>> Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>,
>> Hangzhou; Visiting Professor, ISTIC,
>> <http://www.istic.ac.cn/Eng/brief_en.html>Beijing;
>>
>> Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
>> London;
>>
>> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>>
>>
>>
>> *From:* Dick Stoute [mailto:dick.sto...@gmail.com <dick.sto...@gmail.com>
>> ]
>> *Sent:* Monday, December 19, 2016 12:48 PM
>> *To:* l...@leydesdorff.net
>> *Cc:* James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
>> *Subject:* Re: [Fis] What is information? and What is life?
>>
>>
>>
>> List,
>>
>>
>>
>> Please allow me to respond to Loet about the definition of information
>> stated below.
>>
>>
>>
>> 1. the definition of information as uncertainty is counter-intuitive
>> ("bizarre"); (p. 27)
>>
>>
>>
>> I agree.  I struggled with this definition for a long time before
>> realising that Shannon was really discussing "amount of information", or the
>> number of bits needed to convey a message.  He was looking for a formula
>> that would provide an accurate estimate of that number of bits, and he
>> realised that the amount of information (number of bits) needed to convey a
>> message depended on the "amount" of uncertainty that had to be eliminated,
>> and so he equated these.
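>>
>> A simple worked example (mine, only to illustrate the equation): if a
>> message is one of eight equally likely alternatives, then log2(8) = 3 bits
>> are needed to convey it, and receiving it eliminates exactly those 3 bits
>> of uncertainty about which alternative was sent.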
>>
>>
>>
>> It makes sense to do this, but we must distinguish between "amount of
>> information" and "information".  For example, we can measure the amount of
>> water in liters, but this does not tell us what water is; likewise, the
>> measure we use for "amount of information" does not tell us what
>> information is. We can, for example, equate the amount of water needed to
>> fill a container with the volume of the container, but we should not think
>> that water is therefore identical to an empty volume.  Similarly, we should
>> not think that information is identical to uncertainty.
>>
>>
>>
>> By equating the number of bits needed to convey a message with the
>> "amount of uncertainty" that has to be eliminated, Shannon, in effect,
>> equated opposites so that he could get an estimate of the number of bits
>> needed to eliminate the uncertainty.  We should not therefore consider that
>> this equation establishes what information is.
>>
>>
>>
>> Dick
>>
>>
>>
>>
>>
>> On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net>
>> wrote:
>>
>> Dear James and colleagues,
>>
>>
>>
>> Weaver (1949) made two major remarks about his coauthor (Shannon)'s
>> contribution:
>>
>>
>>
>> 1. the definition of information as uncertainty is counter-intuitive
>> ("bizarre"); (p. 27)
>>
>> 2. "In particular, information must not be confused with meaning." (p. 8)
>>
>>
>>
>>
>> The definition of information as relevant for a system of reference
>> confuses information with "meaningful information" and thus sacrifices the
>> surplus value of Shannon's counter-intuitive definition.
>>
>>
>>
>> “... information observer ... that integrates interactive processes such as
>> physical interactions such [as] photons stimulating the retina of the eye,
>> human-machine interactions (this is the level that Shannon lives on),
>> biological interaction such [as] body temperature relative to touch[ing] ice
>> or [a] heat source, social interaction such as this forum started by Pedro,
>> economic interaction such as the stock market, ...” [Lerner, page 1].
>>
>>
>>
>> We are in need of a theory of meaning. Otherwise, one cannot measure
>> meaningful information. In a previous series of communications we discussed
>> redundancy from this perspective.
>>
>>
>>
>> Lerner introduces mathematical expectation E[Sap] (the difference between
>> a priory entropy [sic] and a posteriori entropy), which is distinguished
>> from the notion of relative information Iap (Lerner, page 7).
>>
>>
>>
>> I = Σ_i q_i log2(q_i/p_i) expresses in bits of information the information
>> generated when the a priori distribution p is turned into the a posteriori
>> one q. This follows within the Shannon framework without needing an
>> observer. I use this equation, for example, in my 1995 book *The Challenge
>> of Scientometrics* (Chapters 8 and 9), with a reference to Theil (1972).
>> The relative information is defined as *H*/*H*(max).
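>>
>> As a minimal sketch of this last measure (in Python; the distribution is
>> only an example), the relative information compares the observed *H* with
>> its maximum, log2(n), which would obtain for the uniform distribution over
>> the same n categories:
>>
>>   import math
>>
>>   p = [0.7, 0.2, 0.1]
>>   h = -sum(pi * math.log2(pi) for pi in p)
>>   h_max = math.log2(len(p))   # maximum entropy: the uniform distribution
>>   print(h / h_max)            # relative information H/H(max), about 0.73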
>>
>>
>>
>> I agree that the intuitive notion of information is derived from the
>> Latin “in-formare” (Varela, 1979). But most of us no longer use “force”
>> and “mass” in the intuitive (Aristotelian) sense. :-) The proliferation of
>> the meanings of information, if confused with “meaningful information”, is
>> indicative of an “index sui et falsi”, in my opinion. The repetitive
>> discussion slows the progress of this list. It is “like asking whether a
>> glass is half empty or half full” (Hayles, 1990, p. 59).
>>
>>
>>
>> “This act of forming an information process results in the construction
>> of an observer that is the owner [holder] of information.”
>>
>>
>>
>> The system of reference is then no longer the message, but the observer
>> who provides meaning to the information (uncertainty). I agree that this is
>> a selection process, but the variation first has to be specified
>> independently (before it can be selected).
>>
>>
>>
>> And Lerner introduces the threshold between objective and subjective
>> observers (page 27). This leads to a consideration of selection and
>> cooperation that includes entanglement.
>>
>>
>>
>> I don’t see a direct relation between information and entanglement. An
>> observer can be entangled.
>>
>>
>>
>> Best,
>>
>> Loet
>>
>>
>>
>> PS. Pedro: Let me assume that this is my second posting in the week which
>> ends tonight. :-(
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> --
>>
>>
>> 4 Austin Dr. Prior Park St. James, Barbados BB23004
>> Tel:   246-421-8855
>> Cell:  246-243-5938
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> --
>>
>> Dr. Mark William Johnson
>> Institute of Learning and Teaching
>>
>> Faculty of Health and Life Sciences
>>
>> University of Liverpool
>>
>>
>>
>> Phone: 07786 064505
>> Email: johnsonm...@gmail.com
>> Blog: http://dailyimprovisation.blogspot.com
>>
>>
>>
>>
>>
>>
>>
>> --
>>
>> Professor Terrence W. Deacon
>> University of California, Berkeley
>>
>>
>>
>


-- 
Professor Terrence W. Deacon
University of California, Berkeley
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
