Dear all,

It's important to remain practical. Shannon's formulae are practical. Their
correspondence with certain tenets of cybernetics, such as Ashby's Law or
Maturana's "structural coupling", presents Shannon as a window for exploring
*relations* empirically. This I understand to be Bob Ulanowicz's focus too.
I think Ashby's epistemology, which accompanied his championing of Shannon
(and which seems to me quite radical), is worth much deeper exploration; it
was eclipsed by second-order cybernetics in the early 1970s. Klaus
Krippendorff wrote an excellent paper about this here:
http://repository.upenn.edu/cgi/viewcontent.cgi?article=1245&context=asc_papers

Information theory is counting - but it provides a way of measuring
relations, which I think marks it out as distinct from other statistical
techniques such as variance. It also provides the basis for questioning what
we actually mean by counting in the first place: you might call it "critical
counting" (a small worked sketch of the "measuring relations" point follows
the quotation below). For example, Ashby makes a comment about "analogy" - a
key concept if we are to say that one thing is of the same class as another
when we count them together (apologies: I can't find the reference to this
right now, but will send it if anyone is interested):

"The principle of analogy is founded upon the assumption that a degree of
likeness between two objects in respect of their known qualities is some
reason for expecting a degree of likeness between them in respect of their
unknown qualities also, and that the probability with which unascertained
similarities are to be expected depends upon the amount of likeness already
known."

Also, just to correct a possible misconception: I don't think counting
leads to populism. Econometrics has led to populism. Some of the greatest
economists of the 20th century saw the problem - this is why Keynes wrote a
book on probability, and Hayek wrote extensively criticising mathematical
modelling. In the end, I'm afraid, it's an American problem which goes back
to the McCarthy era and the distrust of criticality in the social sciences in
favour of positivist, mathematical, "objectivist" approaches. Those schools
in the US which championed mathematical approaches (Chicago, etc.) got all
the funding and controlled the journals, whilst others were starved. The
legacy from the 1950s is still with us: it's still very hard to get an
economics paper published unless it's got crazy equations in it. In the
end, it's just bad theory - and bad mathematics.

We could well see a similar thing happen with climate science in the next
four years.

Best wishes,

Mark




On 20 December 2016 at 19:55, Bob Logan <lo...@physics.utoronto.ca> wrote:

> Loet - thanks for the mention of our (Kauffman, Logan et al.) definition
> of information, which is a qualitative description of information. As to
> whether one can measure information with our description, my response is
> no, but I am not sure that one can measure information at all. What units
> would one use to measure information? *E* = mc² contains a lot of
> information, but the amount of information depends on context. A McLuhan
> one-liner such as 'the medium is the message' also contains a lot of
> information even though it is only 5 words or 26 characters long.
>
> Hopefully I have provided some information, but how much is impossible to
> measure.
>
> Bob
>
>
>
>
> ______________________
>
> Robert K. Logan
> Prof. Emeritus - Physics - U. of Toronto
> Fellow University of St. Michael's College
> Chief Scientist - sLab at OCAD
> http://utoronto.academia.edu/RobertKLogan
> www.researchgate.net/profile/Robert_Logan5/publications
> https://www.physics.utoronto.ca/people/homepages/logan/
>
>
> On Dec 20, 2016, at 3:26 AM, Loet Leydesdorff <l...@leydesdorff.net>
> wrote:
>
> Dear colleagues,
>
> A distribution contains uncertainty that can be measured in terms of bits
> of information.
> Alternatively: the expected information content *H* of a probability
> distribution is H = -Σ_i p_i log2(p_i).
> *H* is further defined as probabilistic entropy using Gibbs's formulation
> of the entropy, S = -k_B Σ_i p_i ln(p_i).
>
> This definition of information is an operational definition. In my
> opinion, we do not need an essentialistic definition by answering the
> question of “what is information?” As the discussion on this list
> demonstrates, one does not easily agree on an essential answer; one can
> answer the question “how is information defined?” Information is not
> “something out there” which “exists” otherwise than as our construct.
>
> Using essentialistic definitions, the discussion tends not to move
> forward. For example, Stuart Kauffman’s and Bob Logan’s (2007) definition
> of information “as natural selection assembling the very constraints on the
> release of energy that then constitutes work and the propagation of
> organization.” I asked several times what this means and how one can
> measure this information. Hitherto, I only obtained the answer that
> colleagues who disagree with me will be cited. :-) Another answer was that
> “counting” may lead to populism. :-)
>
> Best,
> Loet
>
> ------------------------------
> Loet Leydesdorff
> Professor, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
> l...@leydesdorff.net; http://www.leydesdorff.net/
> Associate Faculty, SPRU <http://www.sussex.ac.uk/spru/>, University of
> Sussex;
> Guest Professor, Zhejiang Univ. <http://www.zju.edu.cn/english/>, Hangzhou;
> Visiting Professor, ISTIC <http://www.istic.ac.cn/Eng/brief_en.html>,
> Beijing;
> Visiting Professor, Birkbeck <http://www.bbk.ac.uk/>, University of
> London;
> http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
>
> *From:* Dick Stoute [mailto:dick.sto...@gmail.com]
>
> *Sent:* Monday, December 19, 2016 12:48 PM
> *To:* l...@leydesdorff.net
> *Cc:* James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
> *Subject:* Re: [Fis] What is information? and What is life?
>
> List,
>
> Please allow me to respond to Loet about the definition of information
> stated below.
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
>
>
> I agree.  I struggled with this definition for a long time before
> realising that Shannon was really discussing "amount of information", or
> the number of bits needed to convey a message.  He was looking for a
> formula that would give an accurate estimate of that number of bits, and he
> realised that it depends on the "amount" of uncertainty that has to be
> eliminated, and so he equated the two.
>
>
>
> It makes sense to do this, but we must distinguish between "amount of
> information" and "information".  For example, we can measure an amount of
> water in liters, but this does not tell us what water is; likewise, the
> measure we use for "amount of information" does not tell us what
> information is. We can, for example, equate the amount of water needed to
> fill a container with the volume of the container, but we should not think
> that water is therefore identical to an empty volume.  Similarly, we should
> not think that information is identical to uncertainty.
>
>
>
> By equating the number of bits needed to convey a message with the "amount
> of uncertainty" that has to be eliminated, Shannon, in effect, equated
> opposites so that he could get an estimate of the number of bits needed to
> eliminate the uncertainty.  We should not, therefore, consider that this
> equation establishes what information is.
>
>
>
> Dick
>
>
> On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net>
> wrote:
>
> Dear James and colleagues,
>
>
>
> Weaver (1949) made two major remarks about his coauthor (Shannon)'s
> contribution:
>
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
> 2. "In particular, information must not be confused with meaning." (p. 8)
>
>
>
> The definition of information as relevant for a system of reference
> confuses information with "meaningful information" and thus sacrifices the
> surplus value of Shannon's counter-intuitive definition.
>
>
>
> information observer
>
>
>
> that integrates interactive processes such as
>
>
>
> physical interactions such as photons stimulating the retina of the eye,
> human-machine interactions (this is the level that Shannon lives on),
> biological interaction such as body temperature relative to touching ice or
> a heat source, social interaction such as this forum started by Pedro,
> economic interaction such as the stock market, ... [Lerner, page 1].
>
>
>
> We are in need of a theory of meaning. Otherwise, one cannot measure
> meaningful information. In a previous series of communications we discussed
> redundancy from this perspective.
>
>
>
> Lerner introduces the mathematical expectation E[Sap] (the difference
> between a priory entropy [sic] and a posteriori entropy), which is
> distinguished from the notion of relative information Iap (Lerner, page 7).
>
>
>
> I = Σ_i q_i log2(q_i/p_i) expresses in bits of information the information
> generated when the a priori distribution (p) is turned into the a
> posteriori one (q). This follows within the Shannon framework without
> needing an observer. I use this equation, for example, in my 1995 book *The
> Challenge of Scientometrics* (Chapters 8 and 9), with a reference to Theil
> (1972). The relative information is defined as *H*/*H*(max).
>
>
>
> I agree that the intuitive notion of information is derived from the Latin
> “in-formare” (Varela, 1979). But most of us no longer use “force” and
> “mass” in the intuitive (Aristotelian) sense. :-) The proliferation of the
> meanings of information, if it is confused with “meaningful information”,
> is indicative of an “index sui et falsi”, in my opinion. The repetitive
> discussion hampers progress on this list. It is “like asking whether a
> glass is half empty or half full” (Hayles, 1990, p. 59).
>
>
>
> This act of forming an information process results in the
> construction of an observer that is the owner [holder] of information.
>
>
>
> The system of reference is then no longer the message, but the observer
> who provides meaning to the information (uncertainty). I agree that this is
> a selection process, but the variation first has to be specified
> independently (before it can be selected).
>
>
>
> And Lerner introduces the threshold between objective and subjective
> observers (page 27). This leads to a consideration of selection and
> cooperation that includes entanglement.
>
>
>
> I don’t see a direct relation between information and entanglement. An
> observer can be entangled.
>
>
>
> Best,
>
> Loet
>
>
>
> PS. Pedro: Let me assume that this is my second posting in the week which
> ends tonight. L.
>
>
>
>
>
>
>
> --
>
>
> 4 Austin Dr. Prior Park St. James, Barbados BB23004
> Tel:   246-421-8855
> Cell:  246-243-5938
>
>
>
>


-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
