Re: [Fis] Fw: Responses

2014-01-22 Thread Loet Leydesdorff
Dear colleagues, 

This discussion, and reading the beautiful book by Bob Logan entitled "What is
information?" (shortly forthcoming), made me go back to reading MacKay (1969)
once more. I cannot find "the distinction that makes a difference" as it is
quoted by Floridi (2005) -- and thereafter repeated by many -- so I think that
the honour goes to Bateson (1973) for "a difference which makes a difference".
MacKay, however, makes the point, for example on p. 136, that the sentence "S
is a source of information" is incomplete: it must always be completed (even if
sometimes implicitly) in the form "S is a source of information to receiver R".
Two sentences later he calls this "significant information", which must be
capable of embodying and abiding by an agreed code or symbolic calculus.

Elsewhere, he distinguishes this substantive concept of information from
amounts of information that can be measured using (Shannon's) information
theory. It seems to me that any discourse (physics, biology, psychology,
sociology, etc.) can be further informed, specifically in terms that are
defined within, and relevant to, that discourse. This accords with the
intuitive sense of information as meaningful information: meaningful for a
discourse.

Shannon's definition of information is counter-intuitive, but it provides us
with a calculus that has major advantages. Katherine Hayles suggested that
the two concepts can be compared to the question of whether a glass is
half-full or half-empty. A Chinese colleague (Wu Yishan) once told me that
Chinese has two words, "sjin sji" and "tsin bao", which correspond
respectively to Shannon's and Bateson's definitions of information.

A substantive definition of information (e.g., as a distinction that makes a
difference for a receiver) requires the specification of the concept in a
theory about the receiving system. This definition is therefore a priori
system-specific; for example, for some of us this system is physics; for
others it is biological discourse. At this level, one can again abstract
from the substance and use Shannon's IT as entropy statistics. Sometimes,
this allows us to explore the use of algorithms developed in one field
(e.g., biology) in another (e.g., sociology). Concepts such as autopoiesis
or auto-catalysis have carried these functions.

For example, in the context of Ascendency Theory, Bob Ulanowicz showed how
one can use the mutual information in three dimensions as an indicator of
systemness. I use it as a systems indicator when operationalizing the
triple helix of university-industry-government relations. Such translations
of metaphors are always in need of further elaboration, because the
theoretical context changes and, with it, the specification of what the
information means. However, the advantage of being able to measure in bits
(or nats or dits) frees us from the philosophical confusion about what
information is.
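
For readers who would like to see this measure spelled out, here is a minimal
sketch in Python; the joint distribution and the function names are invented
for illustration only and are not taken from any of the studies mentioned. The
three-dimensional (interaction) mutual information differs from the bilateral
measure in that it can also become negative.

import numpy as np

def entropy_bits(p):
    # Shannon entropy, in bits, of a probability array; zero cells are ignored.
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information_3d(p_xyz):
    # T(x,y,z) = H(x) + H(y) + H(z) - H(xy) - H(xz) - H(yz) + H(xyz)
    h_x = entropy_bits(p_xyz.sum(axis=(1, 2)))
    h_y = entropy_bits(p_xyz.sum(axis=(0, 2)))
    h_z = entropy_bits(p_xyz.sum(axis=(0, 1)))
    h_xy = entropy_bits(p_xyz.sum(axis=2))
    h_xz = entropy_bits(p_xyz.sum(axis=1))
    h_yz = entropy_bits(p_xyz.sum(axis=0))
    return h_x + h_y + h_z - h_xy - h_xz - h_yz + entropy_bits(p_xyz)

# Invented counts of documents classified along three binary dimensions
# (say, university, industry, and government involvement).
counts = np.array([[[10., 2.], [3., 8.]],
                   [[4., 9.], [7., 12.]]])
print(mutual_information_3d(counts / counts.sum()))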

In my opinion, information can only be defined within a discourse. Shannon's
mathematical definition has specific functions which enable us to combine it
with different discourses (among them, specifically, physics, since S = kB *
H). H, however, is dimensionless and defined as the expected information
content of a message *before* it is received. It has yet to be provided with
meaning. One could consider this meaninglessness as the specific difference of
a mathematical concept of information. (Perhaps it is easier to use
"uncertainty" for this mathematical concept.)
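
A small numerical illustration (the four-symbol source below is hypothetical)
of how H remains a dimensionless expectation until the Boltzmann constant
provides it with the dimensionality of physics:

import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K

p = [0.5, 0.25, 0.125, 0.125]                  # a hypothetical message source
H_bits = -sum(pi * math.log2(pi) for pi in p)  # expected information content, dimensionless
H_nats = H_bits * math.log(2)                  # the same expectation, in nats
S = k_B * H_nats                               # thermodynamic entropy, in J/K
print(H_bits, S)                               # 1.75 bits; about 1.67e-23 J/K

Nothing in these few lines refers to what the messages are about; that meaning
still has to be supplied by the receiving discourse.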

Best wishes,
Loet

-----Original Message-----
From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On
Behalf Of Robert E. Ulanowicz
Sent: Tuesday, January 21, 2014 8:45 PM
To: Christophe
Cc: fis@listas.unizar.es
Subject: Re: [Fis] Fw: Responses


 The reason of being of information, whatever its content or quantity, 
 is to be used by an agent (biological or artificial).

Dear Christophe,

In making this restriction you are limiting the domain of information to
communication and excluding all information that inheres in structure
per se. John Collier has called the latter manifestation "enformation", and
the calculus of IT is quite effective in quantifying its extent.
Perhaps John would like to comment?

Cheers,
Bob U.




Re: [Fis] Fw: Responses

2014-01-22 Thread John Collier


At 09:45 PM 2014-01-21, Robert E. Ulanowicz wrote:
  The reason of being of information, whatever its content or quantity, is
  to be used by an agent (biological or artificial).
 Dear Christophe,
 In making this restriction you are limiting the domain of information to
 communication and excluding all information that inheres in structure
 per se. John Collier has called the latter manifestation "enformation",
 and the calculus of IT is quite effective in quantifying its extent.
 Perhaps John would like to comment?
I developed this concept in order to reply to Jeff Wicken's complaint
that Brooks and Wiley did not distinguish properly between the complement
of entropy and structural information, but I used it in print to discuss,
in the context of cognitive science and especially John Perry's use of
information (see Barwise and Perry, "Situations and Attitudes", and his
"What is information?", as well as Dretske's book on information and
perception), what the world must be like in order to make sense of
information coming from the world into our brains. The article is "Intrinsic
Information" (1990), in P. P. Hanson (ed.), Information, Language and
Cognition: Vancouver Studies in Cognitive Science, Vol. 1 (originally
University of British Columbia Press, now Oxford University Press, 1990):
390-409. Details about information are there, but the gist of it is that it
can be measured, is unique, and depends on time scale to distinguish it
from informational entropy in information systems. The uniqueness
hypothesis was developed very carefully in my former student Scott
Muller's PhD thesis, published as "Asymmetry: The Foundation of
Information" (The Frontiers Collection) by Springer in 2007.
I am rather busy now at a conference, or else I would say more here.
John




Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292  F: +27 (31) 260 3031
http://web.ncf.ca/collier





Re: [Fis] Fw: Responses

2014-01-22 Thread Christophe
Dear Bob U,
If you are talking about resident information, as available for usage, I take 
it as being part of the information that can be used by the agent.
Let me go through John's paper (thanks John). 
Best
Christophe
 
 Date: Tue, 21 Jan 2014 14:45:15 -0500
 Subject: Re: [Fis] Fw:  Responses
 From: u...@umces.edu
 To: christophe.men...@hotmail.fr
 CC: lo...@physics.utoronto.ca; fis@listas.unizar.es
 
 
  The reason of being of information, whatever its content or quantity, is
  to be used by an agent (biological or artificial).
 
 Dear Christophe,
 
 In making this restriction you are limiting the domain of information to
 communication and excluding all information that inheres in structure
 per se. John Collier has called the latter manifestation "enformation",
 and the calculus of IT is quite effective in quantifying its extent.
 Perhaps John would like to comment?
 
 Cheers,
 Bob U.
 
 


Re: [Fis] Fw: Responses

2014-01-21 Thread Robert E. Ulanowicz

 The reason of being of information, whatever its content or quantity, is
 to be used by an agent (biological or artificial).

Dear Christophe,

In making this restriction you are limiting the domain of information to
communication and excluding all information that inheres in structure
per se. John Collier has called the latter manifestation "enformation",
and the calculus of IT is quite effective in quantifying its extent.
Perhaps John would like to comment?

Cheers,
Bob U.




Re: [Fis] Fw: Responses

2014-01-16 Thread Bob Logan
Hi Christophe - I enjoyed your response - full of meaningful information :-)
Your point is well taken. I agree that what might be meaningful information for
one agent might be meaningless for another. I can add another example to your
list, which I encountered some time ago: an author whose name I forget pointed
out that a book written in Urdu is information for a literate Urdu speaker but
perhaps not for those who cannot read Urdu. According to the definitions of
Donald MacKay in 1969 and Gregory Bateson in 1973, 'information is a distinction
that makes a difference' and 'information is a difference that makes a
difference' respectively. Meaningless information does not cut it by their
definitions, as it does not make a difference. Of course one could define
information as a distinction or a difference that has the potential to make a
difference for some agent. It seems to me that defining information is no easy
task. My conclusion from our discussion is that, depending on how you define
information, context can be an important part of what information is.

The notion of information is extremely nuanced, with multiple meanings, and we
seem to have only one word for it, as pointed out by Shannon himself. In the
abstract to his paper 'The Lattice Theory of Information', Shannon (1953) wrote:
'The word information has been given many different meanings by various writers
in the general field of information theory. It is likely that at least a number
of these will prove sufficiently useful in certain applications to deserve
further study and permanent recognition. It is hardly to be expected that a
single concept of information would satisfactorily account for the numerous
possible applications of this general field. The present note outlines a new
approach to information theory, which is aimed specifically at the analysis of
certain communication problems in which there exist a number of information
sources simultaneously in operation.'

MacKay made a distinction between 'selective information' as defined by 
Shannon's formula and 'structural information', which indicates how 'selective 
information' is to be interpreted.

Structural information must involve semantics and meaning if it is to succeed 
in its role of interpreting selective or Shannon information. Structural 
information is concerned with the effect and impact of the information on the 
mind of the receiver and hence is reflexive. Structural information has a 
relationship to pragmatics as well as semantics where pragmatics tries to 
bridge the explanatory gap between the literal meaning of a sentence and the 
meaning that the speaker or writer intended. Shannon information has no 
particular relation to either semantics or pragmatics. It is only concerned 
with the text of a message and not the intentions of the sender or the possible 
interpretations of the receiver (Logan 2014).

The above material is from my book What is Information?, to be published
simultaneously as a printed book and an e-book by DEMO press in Toronto. I
would be happy to share it as a PDF with you, Christophe, or with any other
reader who finds the above information meaningful and interesting.

Thank you, Christophe, for providing me with the opportunity to muse some more
about the meaning of information, especially 'meaningless information'. I quite
enjoy going down the rabbit hole. All the best, Bob
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan


On 2014-01-16, at 5:52 AM, Christophe wrote:

 Dear Bob, 
 Thanks for your answer. 
 So for you, information is always meaningful.
 Such a statement is surprising when many examples display cases of 
 meaningless information. 
 The well-known Chinese Room Argument: a sentence written in Chinese is 
 meaningless to a non-Chinese-speaking reader.
 A vervet monkey alarm is meaningful information for other vervet monkeys but 
 is meaningless for a passing dog. 
 Cyphered information is meaningful or meaningless depending on whether the 
 receiver has the cyphering key or not.
 We can agree that the meaning of information does not exist by itself but is 
 the result of an interpretation by a system. The interpretations of given 
 information can deliver different meanings, and no meaning at all is a 
 possible outcome. Proposing to downgrade the interpretation of information 
 to signals is surprising.
 Now, regarding information theory, I still understand it against a scientific 
 background, as it is part of mathematics and computing. Things can indeed be 
 different if you look at applications of IT (to linguistics, 
 psychology, …). 
 But the key point may be that disregarding the possibility of meaningless 
 information shuts a road in analyzing the possibilities for computers to 
 understand us. A lot is still to be done in this area and my take is that 
 using the 

Re: [Fis] FW: Responses

2014-01-12 Thread Robert E. Ulanowicz
Dear Christophe,

I tried to qualify my use of "meaning", but perhaps I wasn't clear enough.

In my example I wanted to say that I(A;B) is a quantity that can be
considered a proto-meaning of B to A. Another way of saying the same
thing is that I(A;B) quantifies A in the context of B.

I should have added that I don't confine my notion of information to the
scenario of communication. I feel that its discovery in that context was
an historical accident. Rather, like Stan, I consider information more
generally as constraint, and the information of communication becomes a
subset of the more general attribute.

Hence, anything that is held together by constraints is amenable in one
form or another to the Bayesian forms of Shannon capacity. The real
advantage in doing so is that the complement of the Bayesian information
is made explicit. Vis-à-vis constraint, this complement becomes
flexibility. Such flexibility is an apophasis that is missing from most
scientific endeavors, but is essential to our understanding of evolution.
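
One way to make this explicit (a sketch only: the contingency table is
invented, and this is the unscaled Shannon bookkeeping rather than the
flow-weighted formulas of Ascendency Theory) is to show how the joint
uncertainty splits into the mutual constraint plus a residual that plays the
role of flexibility:

import numpy as np

def H(p):
    # Shannon entropy (bits); zero cells are ignored.
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Invented joint frequencies of two categorical variables A (rows) and B (columns).
counts = np.array([[40., 10.,  5.],
                   [ 8., 30.,  7.],
                   [ 2.,  6., 20.]])
p_ab = counts / counts.sum()
p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)

h_ab = H(p_ab)                 # total uncertainty H(A,B)
i_ab = H(p_a) + H(p_b) - h_ab  # mutual constraint I(A;B)
flexibility = h_ab - i_ab      # the explicit complement, H(A|B) + H(B|A)
print(h_ab, i_ab, flexibility)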

You are probably correct that my terminology is not orthodox and possibly
confusing to some. But I see such inconvenience as a small price to pay
for opening up a new window on quantitative evolutionary theory. I really
want folks to think outside the box of communication theory. What Shannon
started, many (such as Terry Deacon) have prematurely cast aside. My
message is that we need to re-evaluate Shannon-type measures in their
Bayesian contexts. They have the potential of becoming very powerful
quantitative tools. (E.g.,
http://www.cbl.umces.edu/~ulan/pubs/EyesOpen.pdf.)

Peace,
Bob


 Bob,

 You seem to implicitly consider all information as being meaningful.

 I'm afraid such a position is a source of confusion for a scientific
 approach to information.

 As we know, Shannon defined a quantity of information to measure the
 capacity of the channel carrying it. He did not address the “meaning” that
 the information may carry (as you write in your paper, “Shannon information
 considers the amount of information, nominally in bits, but is devoid of
 semantics”). Today DP & Telecom activities use Shannon-type information
 without referring to its meaning.
 Considering “information” as being meaningful looks to me potentially
 misleading and a source of misunderstandings in our discussions. Information
 can be meaningful or meaningless. The meaning comes from the system that
 manages the information.

 Some research activities explicitly consider information as meaningful
 data (see http://www.mdpi.org/entropy/papers/e5020125.pdf). I'm afraid such
 a position creates some vocabulary problems if we want to keep a scientific
 background when trying to understand what “meaning” is.

 The meaning of information is not something that exists by itself. The
 meaning of information is related to the system that manages the information
 (creates it or uses it). And the subject has to be addressed explicitly, as
 systems can be animals, humans or machines. Focusing on meaning generation
 by a system can bring some clarification (see
 http://www.mdpi.org/entropy/papers/e5020193.pdf).

 Hope this helps

 Christophe





 From: lo...@physics.utoronto.ca
 Date: Sat, 11 Jan 2014 10:49:28 -0500
 To: gordana.dodig-crnko...@mdh.se
 CC: fis@listas.unizar.es
 Subject: Re: [Fis] Fw:  Responses

 Dear Friends - I have been lurking as far as the discussion of Shannon
 information is concerned. I must confess I am a bit confused by the use of
 the term information associated with what you folks call Shannon
 information. To my way of thinking Shannon produced a theory of signals
 and not information. Signals sometimes contain information and sometimes
 they do not which is exactly what Shannon said of his notion of
 information. Here is what troubles me: A set of random numbers according
 to Shannon has more information than a structured set of numbers like the
 set of even numbers. For me a random set of numbers contains no
 information. Now I am sure you will agree with me that DNA contains
 information and certainly more information than a soup of random organic
 chemicals which seems to contradict Shannon's definition of information. I
 would appreciate what any of you would make of this argument of mine. Here
 is another thought about information that I would love to have some
 comments on. The information in this email post to you folks will appear
 on multiple computers, it might be converted into ink on paper, it might
 be read aloud. The information is not tied to any particular physical
 medium. But my DNA cannot be emailed, printed out or in any way separated
 from the physical medium in which it is instantiated. As far as I know it
 has been transferred in part to my 4 kids and 4 grandkids so far, and there
 it stops for the time being at least. The information in my DNA cannot be
 separated from the medium in which it is instantiated. This information is
 not symbolic. DNA is not a symbol of RNA

[Fis] Fw: Responses

2014-01-11 Thread Joseph Brenner
Dear Hans and All,

This is a very useful form of responses, which enables further directed 
comments. I start with Lars's, which is perhaps, as Hans says, crucial:

Lars -- How does QBism differ from Copenhagen? This is a crucial question. It
differs not at all in the formalism, and only subtly in the interpretation.
Many users of quantum mechanics regard the wavefunction as a real property of
an electron. They talk about the wavefunction in the same way you might say
'the speed of the car'. They must then deal with perennial problems such as
action-at-a-distance and the collapse of the wavefunction. QBists regard the
wavefunction the way Bruno de Finetti regarded probability, when he wrote, in
caps, PROBABILITY DOES NOT EXIST. I think he meant that the probability of a
coin falling heads is not a measurable property of a coin. All it is is a
personal belief about how much an agent should bet. And that belief changes
instantly and locally when you make a measurement, or hear that someone else
has made one.

Joseph (New) -- One does not learn much about the way things are by reference 
to simple, binary phenomena (coins) or more complicated versions in game theory 
(profit-loss). All these have little to do with processes, such as information, 
which embody complex oscillations between presence and absence, non-meaning and 
meaning, and so on.

Some people call the Copenhagen wavefunction ontic, the QBist one epistemic.

Joseph (New) -- This is an extremely important statement by Hans whose 
consequences, IMHO, should be discussed as they relate to information. We all 
agree that one cannot surf on ontic Copenhagen waves, while we 'know what we 
know'. But ontic positions, when some of the deficiencies of the original 
Copenhagen interpretation are corrected, have a lot to say about /how/ we know, 
what the properties of what we know are, and how the two are interrelated. 
Rather than saddle QBism with an ignorantist position, I would like to see it 
expand to include this relation.

Gordana -- I am out of my depth in a discussion of 
phenomena/noumena/Dinge-an-sich. But when I agree that the Higgs exists out 
there in the world, I am sure it's not an object like a marble, but a symbol 
for a collection of experiences that many people have had, and have discussed, 
and codified, so that if they perform another experiment where it might play a 
role, they can be prepared with betting odds for what they might experience 
next. 

Joseph (New) -- That it is a collection of experiences does not exclude that
it is an object, or better a process, of a kind other than a 'marble'. As such,
in discussing it, we can go beyond binary game metaphors.


Joseph -- "The electron is a point" means that no experiment to date has
found evidence for a finite size. In the theory (quantum electrodynamics)
there is no room for any parameter with dimensions of length, although there
are mass, charge, spin, and magnetic moment. When you introduce a finite size
into the theory, it makes wrong predictions. (This is not true for protons, for
example.)

Joseph (New) -- Same as above. The fact that an electron does not have a 
'finite size (diameter)' does not mean that it does not exist objectively. It
is fuzzy, with a 'size' greater than the Planck length and less than that of a
hadron.


"The gravitational field lives in 3D" was not supposed to deny that Einstein's
elegant formulation treats time as a fourth dimension. But a quantum field
is an altogether different and much more complicated beast, which lives in
infinite dimensions and has no analog whatever in our everyday human world.

Joseph (New) -- In my opinion, 1) current theories of gravity add further
dimensions to Einstein's original four for the gravitational field; 2)
current quantum theories do not saddle the quantum field with a mathematical
infinity. There are no such infinities in nature. That there is no analog of
the quantum field at the macroscopic level does not mean that there are no
isomorphisms between levels. One aspect is that of the couple duality -
self-duality, as I mentioned earlier.


Having a proper view of physics among the many possible is critical to placing 
information theory on a sound basis. I have proposed Logic in Reality as one 
way of giving meaning to the statement that energy and information processes 
are non-separably related and how they are related. Are there others?

Thank you and best regards,

Joseph


Re: [Fis] Fw: Responses

2014-01-11 Thread Loet Leydesdorff
Having a proper view of physics among the many possible is critical to
placing information theory on a sound basis. I have proposed Logic in
Reality as one way of giving meaning to the statement that energy and
information processes are non-separably related and how they are related.
Are there others?

 

Dear Joseph, 

 

It seems to me that there is at least one alternative: Shannon's
mathematical theory of information. Information is then defined as
content-free. Thermodynamic entropy (physics) is the special case in which H
is multiplied by the Boltzmann constant, so that one obtains the
dimensionality of S (S = kB * H). Information theory, however, can also be
used in other contexts, such as economics (Theil, 1972). It does not carry a
realistic interpretation such as the one in your argument.
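
To illustrate the economic usage (a sketch with invented figures; the function
name is mine, not Theil's): Theil's inequality index applies the same H-type
formula to income shares instead of to physical microstates.

import math

def theil_T(values):
    # Theil's T index: (1/N) * sum_i (x_i / mean) * ln(x_i / mean).
    # It is zero under perfect equality and at most ln(N).
    mean = sum(values) / len(values)
    return sum((x / mean) * math.log(x / mean) for x in values) / len(values)

incomes = [20_000, 30_000, 30_000, 50_000, 120_000]  # invented figures
print(theil_T(incomes))                              # about 0.22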

 

From such a more differentiated perspective, concepts (e.g., electrons) do
not exist, but are meaningful and codified within theories. There can be
sufficient evidence (in physical theorizing) to assume (for the time being)
that the external referents (electrons) exist. The logic is not in reality,
but in the argument, and one cannot jump to the (ontic) conclusion of
existence.

 

Thus, perhaps the sentence "we all agree ..." (with you?) is a bit premature. 

 

Best, 

Loet

 
