Re: [Fis] What is information? and What is life?

2016-12-19 Thread Bob Logan
Dear Dick - I loved your analysis. You are right on the money. It also explains 
why Shannon dominated the field of information. He had a mathematical formula, 
and there is nothing more appealing to a scientist than a mathematical formula. 
But you are right: his formula only tells us how many bits are needed to 
represent some information, but it tells us nothing about its meaning or its 
significance. As Marshall McLuhan said about Shannon information, it is figure 
without ground. A figure only acquires meaning when one understands the ground 
in which it operates. So Shannon’s contribution to engineering is excellent, but 
it tells us nothing about the nature of information or its impact, as you wisely 
pointed out. Thanks for your insight.

I would like to refer to your insight the next time I write about information and 
want to attribute you correctly. Can you tell me a bit about yourself, such as where 
you do your research? Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications


On Dec 19, 2016, at 6:48 AM, Dick Stoute  wrote:

List,

Please allow me to respond to Loet about the definition of information stated 
below.  

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)
 
I agree.  I struggled with this definition for a long time before realising 
that Shannon was really discussing "amount of information" or the number of 
bits needed to convey a message.  He was looking for a formula that would 
provide an accurate estimate of the number of bits needed to convey a message 
and realised that the amount of information (number of bits) needed to convey a 
message was dependent on the "amount" of uncertainty that had to be eliminated 
and so he equated these.  
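
(A minimal sketch in Python of this "amount of information", assuming a toy source 
whose symbol probabilities are invented for illustration: Shannon's H = -sum p*log2(p) 
gives the average number of bits needed per symbol, i.e. how much uncertainty 
receiving a symbol removes, and says nothing about what the symbols mean.)

    import math

    def shannon_entropy_bits(probabilities):
        # Average number of bits needed per symbol to encode a source
        # with the given symbol probabilities (Shannon's H).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Toy source: four symbols with invented probabilities.
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    H = shannon_entropy_bits(source.values())
    print(f"Amount of information: {H:.3f} bits per symbol")  # 1.750
    # H says how many bits are needed, i.e. how much uncertainty is
    # removed by receiving a symbol; it says nothing about meaning.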

It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure amount of water in 
liters, but this does not tell us what water is and likewise the measure we use 
for "amount of information" does not tell us what information is. We can, for 
example equate the amount of water needed to fill a container with the volume 
of the container, but we should not think that water is therefore identical to 
an empty volume.  Similarly we should not think that information is identical 
to uncertainty.

By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated, Shannon, in effect, equated opposites so 
that he could get an estimate of the number of bits needed to eliminate the 
uncertainty.  We should not therefore consider that this equation establishes 
what information is. 

Dick


On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net> wrote:
Dear James and colleagues,

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8)

 

The definition of information as relevant for a system of reference confuses 
information with "meaningful information" and thus sacrifices the surplus value 
of Shannon's counter-intuitive definition.

 

information observer

 

that integrates interactive processes such as

 

physical interactions such as photons stimulating the retina of the eye, 
human-machine interactions (this is the level that Shannon lives on), 
biological interaction such as body temperature relative to touching an ice or 
heat source, social interaction such as this forum started by Pedro, economic 
interaction such as the stock market, ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure meaningful 
information. In a previous series of communications we discussed redundancy 
from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between the a 
priory entropy [sic] and the a posteriori entropy), which is distinguished from the 
notion of relative information Iap (Lerner, page 7).

 

This expected information expresses, in bits of information, the information 
generated when the a priori distribution is turned into the a posteriori one. This 
follows within the Shannon framework without needing an observer. I use this 
equation, for example, in my 1995 book The Challenge of Scientometrics (Chapters 8 
and 9), with a reference to Theil (1972). The relative information is defined as 
H/H(max).
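
(A short sketch of these two quantities as they are usually defined in the Shannon 
framework after Theil (1972); the distributions below are invented for illustration. 
The information generated when an a priori distribution p is turned into an a 
posteriori distribution q is I = sum_i q_i * log2(q_i / p_i), and the relative 
information of a distribution is its entropy H divided by H(max) = log2(n).)

    import math

    def expected_information(prior, posterior):
        # Bits generated when the a priori distribution `prior` is turned
        # into the a posteriori distribution `posterior`:
        # I = sum_i q_i * log2(q_i / p_i)   (Theil, 1972)
        return sum(q * math.log2(q / p)
                   for p, q in zip(prior, posterior) if q > 0)

    def relative_information(dist):
        # H / H(max): entropy of `dist` relative to log2(n),
        # the maximum entropy for the same number of categories.
        h = -sum(p * math.log2(p) for p in dist if p > 0)
        return h / math.log2(len(dist))

    prior = [0.25, 0.25, 0.25, 0.25]       # invented a priori distribution
    posterior = [0.70, 0.10, 0.10, 0.10]   # invented a posteriori distribution
    print(f"I: {expected_information(prior, posterior):.3f} bits")
    print(f"H/H(max): {relative_information(posterior):.3f}")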

 

I agree that the intuitive notion of information is derived from the Latin 
“in-formare” (Varela, 1979). But most of us no longer use “force” and “mass” 
in the intuitive (Aristotelian) sense. The proliferation of the meanings of 
information when confused with “meaningful information” is indicative of an 
“index sui et falsi”, in my opinion.

Re: [Fis] What is information? and What is life?

2016-12-19 Thread Karl Javorszky
What is Information?



Once more, Occam and the numbers give a simple, short and concise
explanation. (There is more text and a formal definition of information in
my book “Natural Orders” ISBN: 9783990571378.)



The root of the term “information” is in the concept of order. The idea of
order can be taken as axiomatic by people who are interested in the
definition of information. For those who need a deictic definition of the
term “order”: take n (n > 3) objects. One can use teddy-bears, shoes,
pieces of paper, numbers, whatever. We sort the objects. The sequence that
results after we have ordered the objects is the deictic definition of the
term “order”. This may seem at odds with the use of the term in
mathematics; to reconcile the two concepts, we point out that the
traditional definition in mathematics refers to order as a potential,
realisable property of the collection, while here we speak of order as a
realised instance of the general capacity of the objects to be in order. The
distinction is always clear from the context. A collection that shows a
sequence of its elements is an ordered collection.



From the order to the information:

Whichever order exists, it has alternatives and a background. The
alternatives are those variants of the order which are not realised; the
background is that state of the world about which we cannot say anything
definite.

To repeat:

n distinguishable objects have n! possible permutations. We take one
specific permutation. In this permutation, a1 is on place p1. The
alternatives are those permutations where a1 is not on place p1. The
background consists of those permutations where a2, a3, etc. can be on
places px, py, pz, etc.
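
(A small illustrative sketch in Python; the object names, the chosen permutation, and
the reading that the background keeps a1 on p1 while the other elements vary are my
assumptions, not part of the text above.)

    from itertools import permutations

    objects = ["a1", "a2", "a3", "a4"]        # n = 4 invented objects
    realised = ("a1", "a3", "a2", "a4")       # the one permutation we have ordered

    all_perms = set(permutations(objects))    # n! = 24 permutations in total

    # Alternatives: permutations in which a1 is NOT on place p1 (index 0).
    alternatives = {p for p in all_perms if p[0] != "a1"}

    # Background (one reading): permutations that keep a1 on p1, so nothing
    # definite is said about where a2, a3, ... end up.
    background = {p for p in all_perms if p[0] == "a1" and p != realised}

    print(len(all_perms), len(alternatives), len(background))   # 24 18 5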



Novelty:

We also construct logical sentences which state something that is not
the case. These are false logical statements. Traditionally, one does not
use logical sentences that are false. This is not a rule given by a Supreme
Logician from Heaven, but a convention of convenience. Quite apart from the
conceptual aversion against doing so, it would have been inconceivable to
write up all the results of the multiplication table that are not correct.

Now we have computers. These can register all that is not the case, as
long as we restrict ourselves to using rather few logical words when we
utter a logical sentence. The babbling of an infant can be of high
scientific value if one wants to learn how the acquisition of language
progresses during infancy. We now do not care whether the sentence is
logically true or not, as long as it is grammatically correct. We simply
write up all possible sentences that a child can express. Among these,
there are some in which the child states something correctly (e.g.
recognises and names a toy), some in which the child names an alternative
(e.g. calls a doll a ball), and some which belong to the background, neither
surely true nor surely false (e.g. the child calls something a cluxtli and
we do not know what the child was looking at in that moment). We
investigate all three aspects of the order: the actual sequence, its
excluded alternatives, and the background to these.



Resume:

Traditionally, tertium non datur, therefore that which is not the case is
defined simply as .not. .t. = .false. Now we have a more complicated logic,
in which .unknown. is also permitted.
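
(A minimal sketch of such a logic, using Kleene-style three-valued connectives purely
to illustrate the point; the choice of Kleene's tables and the use of None for
.unknown. are my assumptions.)

    # Three-valued logic: True, False, and None standing for .unknown.
    def not3(a):
        # The negation of unknown is still unknown.
        return None if a is None else (not a)

    def and3(a, b):
        # Kleene conjunction: a definite False decides; otherwise unknown propagates.
        if a is False or b is False:
            return False
        if a is None or b is None:
            return None
        return True

    print(not3(True))         # False -- the classical .not. .t. = .false. still holds
    print(and3(True, None))   # None  -- background: nothing definite can be said
    print(and3(False, None))  # False -- definitely not the case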

After the unknown has softened up the trivial and lazy definition of “what
is not true is false, and we do not speak falsely”, there is a need for a word
to describe that which is not true but is a background to that which is
true. Against this background, one may recognise the shadow of what is the
case: that which is definitely not the case.



Information:

Information is what we do know AND what we do not know about the state of the
world. (Footnote to “The world is everything that is the case”: “The
alternatives to that which is the case are described by sentences about that
which cannot be the case, and the background to that which is the case is
described by sentences that state that which can be the case”. And to “About
that which is not the case, one should keep one’s silence” the following:
“unless and until one has found a way not to think with one’s own brain”.)

Traditionally, that which is not the case has been seen as one solid logical
entity, defined by – well, by the fact that it is not the case, in contrast
to that which reasonable people can speak reasonably about. Computers allow
us to slice thinly between layers of what is not the case. What oozes
out is information.

The relation among that which is the case, that which it implies, and
that which is not affected by that which is the case needs some new words to
allow precision in thinking. One of the available words is
“information”. Scientists have always supposed that there is something
hidden, yet obviously at work, behind symbols, bits and logical statements.
Forcing this oyster open shows its inner life. The interdependence between
what can be the c

Re: [Fis] Fwd: What is life?

2016-12-19 Thread Mark Johnson
Dear Bob (Ulanowicz),

I hope I didn't come across as flippant about the political situation.
The world is obviously in a very frightening and perilous state at the
moment.

Regarding ecology, Shannon and IT, the common denominator is
"counting". This is far from trivial. The disastrous economic policies
which have delivered inequality and austerity... along with Brexit and
Trump... have relied on approaches to measurement and information
which we must now question. In my understanding of your work in
ecology, you count event regularities in the ecosystem.

A change to our understanding of number and counting would change the
way we see the world in a fundamental way (the recent discussion about
Joe Brenner's work on Lupasco is fascinating and it's my Christmas job
to dig into it). There's something important about a logic which
transcends binary distinctions (Spencer Brown and Lou Kauffman,
category theorists, etc all seem to be poking at this).

I remain optimistic. Governor Jerry Brown (a Bateson student) gave a
wonderful, defiant speech a couple of days ago: "Sometimes people need
to have a heart attack to get them to stop smoking. We've just had a
heart attack."

Some fundamental root and branch rethinking is required.

Best wishes,

Mark

On 18 December 2016 at 23:34, Robert E. Ulanowicz  wrote:
>> Thank you Bob!
>>
>> The medium is a very restricted form of communication on the internet, of
>> course...
>>
>> Are our circular deliberations about information victims of the so-called
>> "information technology" which enables them? Is this a variety of
>> Wittgenstein's realisation that the problems of philosophy were problems
>> of language? Perhaps we cannot see the constraints that communications
>> technology itself has on our discourse. How might we try to see them?
>
> Mark, I have long argued against identifying IT with communications
> theory. You're right, doing so does place needless restrictions on the
> discourse. As an ecologist, I use IT to quantify constraint and freedom
> inherent in ecosystem trophic webs, which has nothing to do with
> communication theory. In fact the whole discipline can be treated as
> homologous to probability theory in total abstraction of communications.
>
>> Maybe this goes some way towards accounting for the strange political
>> situation we find ourselves in at the moment!
>
> Almost everyone I know is feeling depressed and dreadful in anticipation
> of what will happen after Jan 20. The feeling is that the Republic is very
> much at risk.
>
>> Best wishes,
>>
>> Mark
>
> Cheers,
> Bob U.
>
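
(For readers unfamiliar with this use of information theory in ecology, a rough sketch:
the average mutual information of a flow network, computed from a matrix T[i][j] of
flows from compartment i to compartment j, quantifies how constrained the flow
structure is, with no communication-theoretic "message" anywhere in sight. The flow
values below are invented, and this is only one of the indices used in Bob U.'s work.)

    import math

    def average_mutual_information(T):
        # Average mutual information (bits) of a flow matrix T[i][j]
        # (flow from compartment i to compartment j). Higher values
        # indicate a more constrained, less "free" flow structure.
        total = sum(sum(row) for row in T)
        row_sums = [sum(row) for row in T]
        col_sums = [sum(col) for col in zip(*T)]
        ami = 0.0
        for i, row in enumerate(T):
            for j, t in enumerate(row):
                if t > 0:
                    ami += (t / total) * math.log2(t * total / (row_sums[i] * col_sums[j]))
        return ami

    # Invented three-compartment trophic flows, purely for illustration.
    flows = [
        [0.0, 8.0, 2.0],
        [0.0, 0.0, 6.0],
        [1.0, 0.0, 0.0],
    ]
    print(f"AMI: {average_mutual_information(flows):.3f} bits")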



-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com


Re: [Fis] What is information? and What is life?

2016-12-19 Thread Dick Stoute
List,

Please allow me to respond to Loet about the definition of information
stated below.

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)



I agree.  I struggled with this definition for a long time before realising
that Shannon was really discussing "amount of information" or the number of
bits needed to convey a message.  He was looking for a formula that would
provide an accurate estimate of the number of bits needed to convey a
message and realised that the amount of information (number of bits) needed
to convey a message was dependent on the "amount" of uncertainty that had
to be eliminated and so he equated these.


It makes sense to do this, but we must distinguish between "amount of
information" and "information".  For example, we can measure amount of
water in liters, but this does not tell us what water is and likewise the
measure we use for "amount of information" does not tell us what
information is. We can, for example, equate the amount of water needed to
fill a container with the volume of the container, but we should not think
that water is therefore identical to an empty volume.  Similarly we should
not think that information is identical to uncertainty.


By equating the number of bits needed to convey a message with the "amount
of uncertainty" that has to be eliminated, Shannon, in effect, equated
opposites so that he could get an estimate of the number of bits needed to
eliminate the uncertainty.  We should not therefore consider that this
equation establishes what information is.


Dick


On 18 December 2016 at 15:05, Loet Leydesdorff  wrote:

> Dear James and colleagues,
>
>
>
> Weaver (1949) made two major remarks about his coauthor (Shannon)'s
> contribution:
>
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
> 2. "In particular, information must not be confused with meaning." (p. 8)
>
>
>
> The definition of information as relevant for a system of reference
> confuses information with "meaningful information" and thus sacrifices the
> surplus value of Shannon's counter-intuitive definition.
>
>
>
> information observer
>
>
>
> that integrates interactive processes such as
>
>
>
> physical interactions such as photons stimulating the retina of the eye,
> human-machine interactions (this is the level that Shannon lives on),
> biological interaction such as body temperature relative to touching an ice or
> heat source, social interaction such as this forum started by Pedro, economic
> interaction such as the stock market, ... [Lerner, page 1].
>
>
>
> We are in need of a theory of meaning. Otherwise, one cannot measure
> meaningful information. In a previous series of communications we discussed
> redundancy from this perspective.
>
>
>
> Lerner introduces the mathematical expectation E[Sap] (the difference between the
> a priory entropy [sic] and the a posteriori entropy), which is distinguished from
> the notion of relative information Iap (Lerner, page 7).
>
>
>
> This expected information expresses, in bits of information, the information
> generated when the a priori distribution is turned into the a posteriori one.
> This follows within the Shannon framework without needing an observer. I use this
> equation, for example, in my 1995 book *The Challenge of Scientometrics*
> (Chapters 8 and 9), with a reference to Theil (1972). The relative
> information is defined as *H*/*H*(max).
>
>
>
> I agree that the intuitive notion of information is derived from the Latin
> “in-formare” (Varela, 1979). But most of us no longer use “force” and
> “mass” in the intuitive (Aristotelian) sense. The proliferation of the
> meanings of information when confused with “meaningful information” is
> indicative of an “index sui et falsi”, in my opinion. The repetitive
> discussion slows the progress of this list. It is “like asking whether a
> glass is half empty or half full” (Hayles, 1990, p. 59).
>
>
>
> This act of forming an information process results in the
> construction of an observer that is the owner [holder] of information.
>
>
>
> The system of reference is then no longer the message, but the observer
> who provides meaning to the information (uncertainty). I agree that this is
> a selection process, but the variation first has to be specified
> independently (before it can be selected).
>
>
>
> And Lerner introduces the threshold between objective and subjective
> observers (page 27). This leads to a consideration of selection and
> cooperation that includes entanglement.
>
>
>
> I don’t see a direct relation between information and entanglement. An
> observer can be entangled.
>
>
>
> Best,
>
> Loet
>
>
>
> PS. Pedro: Let me assume that this is my second posting in the week which
> ends tonight. L.
>
>
>


-- 

4 Austin Dr. Prior Park St. James, Barbados BB23004
Tel:   246-421-8