Loet - thanks for the mention of our (Kauffman, Logan et al.) definition of 
information, which is a qualitative description of information. As to whether 
one can measure information with our description, my response is no, but I am 
not sure that one can measure information at all. What units would one use to 
measure information? E = mc² contains a lot of information, but the amount of 
information depends on context. A McLuhan one-liner such as 'the medium is the 
message' also contains a lot of information even though it is only 5 words or 
26 characters long.
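
[Editorial aside: for what it is worth, one can always compute a Shannon-style 
number for such a phrase under some statistical model. A minimal Python sketch 
(an illustration, not the authors' method, using the phrase's own letter 
frequencies as the assumed model) makes the point that the number measures 
symbol statistics, not meaning:

import math
from collections import Counter

msg = "the medium is the message"
counts = Counter(msg)
n = len(msg)
# Empirical entropy of the phrase under an i.i.d. letter model
h = -sum((c / n) * math.log2(c / n) for c in counts.values())
print(f"{h:.2f} bits/char, {h * n:.1f} bits total")  # ~3.24 bits/char, ~81 bits

Change the assumed model (letters vs. words vs. English-language statistics) 
and the number changes, which is exactly the context-dependence noted above.]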

Hopefully I have provided some information, but how much information is 
impossible to measure.

Bob




______________________

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/


On Dec 20, 2016, at 3:26 AM, Loet Leydesdorff <l...@leydesdorff.net> wrote:

Dear colleagues, 
 
A distribution contains uncertainty that can be measured in terms of bits of 
information.
Alternatively: the expected information content H of a probability distribution 
is H = -Σ_i p_i log2(p_i), measured in bits when the base-2 logarithm is used.
H is further defined as probabilistic entropy using Gibbs's formulation of the 
entropy, S = -k_B Σ_i p_i ln(p_i).
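 
For concreteness, a minimal sketch in Python (an editorial illustration; the 
function name is mine, not part of the formal definition) of computing H in 
bits from a probability distribution:

import math

def shannon_entropy_bits(p):
    # H = -sum_i p_i * log2(p_i); p_i = 0 terms contribute nothing (0*log 0 := 0)
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(shannon_entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy_bits([0.9, 0.1]))   # biased coin: ~0.47 bits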
 
This definition of information is an operational definition. In my opinion, we 
do not need an essentialistic definition that answers the question of “what is 
information?” As the discussion on this list demonstrates, one does not easily 
agree on an essential answer; one can, however, answer the question “how is 
information defined?” Information is not “something out there” which “exists” 
otherwise than as our construct.
 
Using essentialistic definitions, the discussion tends not to move forward. 
Take, for example, Stuart Kauffman’s and Bob Logan’s (2007) definition of 
information “as natural selection assembling the very constraints on the 
release of energy that then constitutes work and the propagation of 
organization.” I asked several times what this means and how one can measure 
this information. Hitherto, I have only obtained the answer that colleagues who 
disagree with me will be cited. :-) Another answer was that “counting” may lead 
to populism. :-)
 
Best,
Loet
 
Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net; http://www.leydesdorff.net/ 
Associate Faculty, SPRU, University of Sussex; 
Guest Professor Zhejiang Univ., Hangzhou; 
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London; 
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
 
From: Dick Stoute [mailto:dick.sto...@gmail.com] 
Sent: Monday, December 19, 2016 12:48 PM
To: l...@leydesdorff.net
Cc: James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
Subject: Re: [Fis] What is information? and What is life?
 
List,
 
Please allow me to respond to Loet about the definition of information stated 
below.  
 
1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)
 
I agree.  I struggled with this definition for a long time before realising 
that Shannon was really discussing the "amount of information", i.e. the number 
of bits needed to convey a message.  He was looking for a formula that would 
accurately estimate that number of bits, and he realised that it depends on the 
"amount" of uncertainty that has to be eliminated, and so he equated these.  
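 
[Editorial aside: a toy illustration of that equation of "bits needed" with 
"uncertainty eliminated" (a hedged Python sketch, not Shannon's own example): 
for N equally likely messages, identifying one takes log2(N) yes/no questions, 
which is exactly Shannon's H for the uniform distribution.

import math

# Identifying one of N equally likely messages takes log2(N) binary
# questions; this equals H = -sum(p * log2(p)) when every p = 1/N.
for n in (2, 8, 26):
    h_uniform = -sum((1 / n) * math.log2(1 / n) for _ in range(n))
    print(f"{n} messages: {h_uniform:.2f} bits (= log2({n}) = {math.log2(n):.2f})")
]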
 
It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure an amount of water 
in liters, but this does not tell us what water is; likewise, the measure we 
use for "amount of information" does not tell us what information is. We can, 
for example, equate the amount of water needed to fill a container with the 
volume of the container, but we should not conclude that water is therefore 
identical to an empty volume.  Similarly, we should not conclude that 
information is identical to uncertainty.
 
By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated, Shannon in effect equated opposites so 
that he could estimate the number of bits needed to eliminate the uncertainty.  
We should not, therefore, take this equation to establish what information is. 
 
Dick
 
 
On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net> wrote:
Dear James and colleagues, 

 

Weaver (1949) made two major remarks about his co-author Shannon's 
contribution:

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8) 

 

The definition of information as relevant for a system of reference confuses 
information with "meaningful information" and thus sacrifices the surplus value 
of Shannon's counter-intuitive definition.

 

Lerner's "information observer", by contrast, integrates interactive processes 
such as: physical interactions, such as photons stimulating the retina of the 
eye; human-machine interactions (this is the level that Shannon lives on); 
biological interactions, such as body temperature relative to touching an ice 
or heat source; social interactions, such as this forum started by Pedro; 
economic interactions, such as the stock market; ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure meaningful 
information. In a previous series of communications we discussed redundancy 
from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between 
the a priori entropy and the a posteriori entropy), which is distinguished from 
the notion of relative information Iap (Lerner, page 7).

 

The quantity I = Σ_i q_i log2(q_i / p_i) expresses in bits the information 
generated when the a priori distribution p is turned into the a posteriori one 
q. This follows within the Shannon framework without needing an observer. I use 
this equation, for example, in my 1995 book The Challenge of Scientometrics 
(Chapters 8 and 9), with a reference to Theil (1972). The relative information 
is defined as H/H(max).
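 
Both quantities compute directly; a minimal Python sketch (an editorial 
illustration assuming p for the a priori and q for the a posteriori 
distribution, which is Theil's convention rather than Lerner's notation):

import math

def info_generated_bits(q, p):
    # I = sum_i q_i * log2(q_i / p_i): bits generated when the a priori
    # distribution p is turned into the a posteriori distribution q.
    return sum(qi * math.log2(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def relative_information(p):
    # H / H(max), with H(max) = log2(n) for n categories.
    h = -sum(pi * math.log2(pi) for pi in p if pi > 0)
    return h / math.log2(len(p))

p_prior = [0.25, 0.25, 0.25, 0.25]
q_post = [0.70, 0.10, 0.10, 0.10]
print(info_generated_bits(q_post, p_prior))   # ~0.64 bits generated
print(relative_information(q_post))           # H/H(max) ~ 0.68

The redundancy discussed earlier in this thread is then simply 1 - H/H(max).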

 

I agree that the intuitive notion of information is derived from the Latin 
“in-formare” (Varela, 1979). But most of us no longer use “force” and “mass” in 
the intuitive (Aristotelian) sense. :-) The proliferation of the meanings of 
information, if confused with “meaningful information”, is indicative of an 
“index sui et falsi”, in my opinion. The repetitive discussion stalls progress 
on this list. It is “like asking whether a glass is half empty or half full” 
(Hayles, 1990, p. 59). 

 

This act of forming an information process results in the construction of an 
observer that is the owner [holder] of the information.

 

The system of reference is then no longer the message, but the observer who 
provides meaning to the information (uncertainty). I agree that this is a 
selection process, but the variation first has to be specified independently 
before it can be selected.

 

And Lerner introduces the threshold between objective and subjective observers 
(page 27). This leads to a consideration of selection and cooperation that 
includes entanglement.

 

I don’t see a direct relation between information and entanglement. An observer 
can be entangled.

 

Best, 

Loet

 

PS. Pedro: Let me assume that this is my second posting in the week which ends 
tonight. :-(

 




 
-- 

4 Austin Dr. Prior Park St. James, Barbados BB23004
Tel:   246-421-8855
Cell:  246-243-5938


_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
