Re: [Fis] What is information? and What is life?

2017-01-11 Thread Christophe
Dear Terry,
Are you really sure that trying to link Shannon to higher-order conceptions 
of information like meaning is a realistic ambition?
I compare that to linking the width of a street to the individual motivations 
of the persons who will walk in it.
As we know, Shannon's theory measures the capacity of a communication channel. 
It is not about the possible meanings of the information that may transit 
through the channel.
Information goes through a communication channel because agents want to 
communicate, to exchange meaningful information (the 'outside perspective', as 
you say). And meanings do not exist by themselves. Meaningful information is 
generated by agents that have reasons to generate it. Animals manage meanings in 
order to stay alive (as individuals and as species). Human motivations/constraints 
are more complex, but they are the sources of our meaning generation.
We agree that information is not to be confused with meaning. However, from a 
pragmatic standpoint the two cannot be separated. But this does not imply, I 
feel, that Shannon is to be linked to the meaning of information.
For me the core of the subject is meaning generation. Why and how is 
meaningful information generated? (https://philpapers.org/rec/MENCOI)

All the best to all for 2017.
Christophe


From: Fis, on behalf of Terrence W. DEACON
Sent: Saturday, January 7, 2017 8:15 PM
To: John Collier
Cc: Foundations of Information Science; Dai Griffiths
Subject: Re: [Fis] What is information? and What is life?

Loet remarks:

"... we need a kind of calculus of redundancy."

I agree whole-heartedly.

What for Shannon was the key to error-correction is thus implicitly normative. 
But of course the assessment of normativity (accurate/inaccurate, useful/useless, 
significant/insignificant) must necessarily involve an "outside" perspective, 
i.e. more than merely the statistics of sign-medium characteristics. 
Redundancy is also implicit in concepts like communication, shared 
understanding, iconism, and Fano's "mutual information." But notice too that 
redundancy is precisely non-information in a strictly statistical understanding 
of that concept; a redundant message is not itself "news" — and yet it can 
reduce the uncertainty of what is "message" and what is "noise." It is my 
intuition that by developing a formalization (e.g. a "calculus") using the 
complementary notions of redundancy and constraint we will ultimately be 
able to formulate a route from Shannon to the higher-order conceptions of 
information, in which referential and normative features can be precisely 
formulated.
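
To make the error-correction point concrete, here is a minimal sketch in 
Python — the binary symmetric channel, its flip probability, and the message 
are illustrative assumptions — showing how purely redundant copies, which are 
not themselves "news," let a receiver sort message from noise:

import random

def transmit(bits, flip_prob, rng):
    """Binary symmetric channel: each bit is flipped with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def encode(bits, n=3):
    """Repetition code: send each bit n times (pure redundancy, no new 'news')."""
    return [b for b in bits for _ in range(n)]

def decode(received, n=3):
    """Majority vote over each block of n copies recovers the likely message bit."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

rng = random.Random(42)
message = [1, 0, 1, 1, 0, 0, 1, 0]
plain = transmit(message, 0.1, rng)                  # no redundancy: errors persist
coded = decode(transmit(encode(message), 0.1, rng))  # redundancy: most errors corrected
print(message, plain, coded, sep="\n")

The two extra copies of each bit carry no information in the strictly 
statistical sense, yet they are exactly what allows the majority vote to tell 
"message" from "noise."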

There is an open door, though it still seems pretty dark on the other side. So 
one must risk stumbling in order to explore that space.

Happy 2017, Terry

On Sat, Jan 7, 2017 at 9:02 AM, John Collier wrote:
Dear List,

I agree with Terry that we should not be bound by our own partial theories. We 
need an integrated view of information that shows its relations in all of its 
various forms. There is a family resemblance in the ways it is used, and some 
sort of taxonomy can be constructed. I recommend that of Luciano Floridi. His 
approach is not unified (unlike my own, reported on this list), but compatible 
with it, and is a place to start, though it needs expansion and perhaps 
modification. There may be some unifying concept of information, but its 
application to all the various ways it has been used will not be obvious, and a 
sufficiently general formulation may well seem trivial, especially to those 
interested in the vital communicative and meaningful aspects of information. I 
also agree with Loet that pessimism, however justified, is not the real 
problem. To some extent it is a matter of maturity, which takes both time and 
development, not to mention giving up cherished juvenile enthusiasms.

I might add that constructivism, with its positivist underpinnings, tends to 
lead to nominalism and relativism about whatever is out there. I believe that 
this is a major hindrance to a unified understanding. I understand that it 
appeared in reaction to an overzealous and simplistic realism about science and 
other areas, but I think it threw the baby out with the bathwater.

I have been really ill, hence my lack of communication. I am pleased to see this 
discussion, which is necessary for the field to develop maturity. I thought I 
should add my bit, and wish everyone a Happy New Year, with all its 
possibilities.

Warmest regards to everyone,
John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: December 31, 2016 12:16 AM
To: 'Terrence W. DEACON'; 'Dai Griffiths'; 'Foundations of Information Science'

Re: [Fis] What is information? and What is life?; towards a calculus of redundancy

2017-01-10 Thread Loet Leydesdorff
Toward a Calculus of Redundancy: The feedback arrow of expectations in 
knowledge-based systems

Loet Leydesdorff, Mark W. Johnson, Inga Ivanova 

(Submitted on 10 Jan 2017; https://arxiv.org/abs/1701.02455)

 

Whereas the generation of Shannon-type information is coupled to the second law 
of thermodynamics, redundancy—that is, the complement of information to the 
maximum entropy—can be increased by further distinctions: new options can 
discursively be generated. The dynamics of discursive knowledge production thus 
infuse the historical dynamics with a cultural evolution based on expectations 
(as different from observations). We distinguish among (i) the communication of 
information, (ii) the sharing of meaning, and (iii) discursive knowledge. 
Meaning is provided from the perspective of hindsight as feedback on the 
entropy flow and thus generates redundancy. Specific meanings can selectively 
be codified as discursive knowledge; knowledge-based reconstructions enable us 
to specify expectations about future states which can be invoked in the 
present. The cycling among the dynamics of information, meaning, and knowledge 
in feedback and feedforward loops can be evaluated empirically: When mutual 
redundancy prevails over mutual information, the sign of the resulting 
information is negative, indicating a reduction of uncertainty because of new 
options available for realization; innovation can then be expected to flourish. 
When historical realizations prevail, innovation may be locked in because of 
insufficient options for further development. 
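
The sign criterion in the abstract can be illustrated numerically. Below is a 
minimal Python sketch — the joint distributions are illustrative assumptions — 
of the mutual information in three dimensions, T_xyz = H_x + H_y + H_z − H_xy 
− H_xz − H_yz + H_xyz, a quantity that can take either sign; a negative 
resultant is what the abstract reads as mutual redundancy prevailing over 
mutual information (note that sign conventions for this quantity vary in the 
literature):

from collections import Counter
from math import log2

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, dims):
    """Marginalize a joint distribution onto the kept dimensions `dims`."""
    out = Counter()
    for outcome, p in joint.items():
        out[tuple(outcome[d] for d in dims)] += p
    return out

def T_xyz(joint):
    """Three-dimensional mutual information: Hx + Hy + Hz - Hxy - Hxz - Hyz + Hxyz."""
    return (H(marginal(joint, (0,))) + H(marginal(joint, (1,)))
            + H(marginal(joint, (2,)))
            - H(marginal(joint, (0, 1))) - H(marginal(joint, (0, 2)))
            - H(marginal(joint, (1, 2)))
            + H(joint))

# Two illustrative joint distributions over three binary variables:
copy_joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}                      # X = Y = Z
xor_joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}  # Z = X xor Y
print(T_xyz(copy_joint))  # +1.0 bit
print(T_xyz(xor_joint))   # -1.0 bit: a negative resultant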

 

* Comments are very welcome at this stage

Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ=en

 

From: Terrence W. DEACON [mailto:dea...@berkeley.edu] 
Sent: Saturday, January 07, 2017 8:15 PM
To: John Collier
Cc: l...@leydesdorff.net; Dai Griffiths; Foundations of Information Science
Subject: Re: [Fis] What is information? and What is life?

 

Loet remarks:

 

"... we need a kind of calculus of redundancy."

 

I agree whole-heartedly. 

 

What for Shannon was the key to error-correction is thus implicitly normative. 
But of course the assessment of normativity (accurate/inaccurate, useful/useless, 
significant/insignificant) must necessarily involve an "outside" perspective, 
i.e. more than merely the statistics of sign-medium characteristics. 
Redundancy is also implicit in concepts like communication, shared 
understanding, iconism, and Fano's "mutual information." But notice too that 
redundancy is precisely non-information in a strictly statistical understanding 
of that concept; a redundant message is not itself "news" — and yet it can 
reduce the uncertainty of what is "message" and what is "noise." It is my 
intuition that by developing a formalization (e.g. a "calculus") using the 
complementary notions of redundancy and constraint we will ultimately be 
able to formulate a route from Shannon to the higher-order conceptions of 
information, in which referential and normative features can be precisely 
formulated. 

 

There is an open door, though it still seems pretty dark on the other side. So 
one must risk stumbling in order to explore that space.

 

Happy 2017, Terry

 

On Sat, Jan 7, 2017 at 9:02 AM, John Collier  wrote:

Dear List,

 

I agree with Terry that we should not be bound by our own partial theories. We 
need an integrated view of information that shows its relations in all of its 
various forms. There is a family resemblance in the ways it is used, and some 
sort of taxonomy can be constructed. I recommend that of Luciano Floridi. His 
approach is not unified (unlike my own, reported on this list), but compatible 
with it, and is a place to start, though it needs expansion and perhaps 
modification. There may be some unifying concept of information, but its 
application to all the various ways it has been used will not be obvious, and a 
sufficiently general formulation may well seem trivial, especially to those 
interested in the vital communicative and meaningful aspects of information. I 
also agree with Loet that pessimism, however justified, is not the real 
problem. To some extent it is a matter of maturity, which takes both time and 
development, not to mention giving up cherished juvenile enthusiasms.

 

I might add that 

Re: [Fis] What is information? and What is life?

2017-01-10 Thread Terrence W. DEACON
Loet remarks:

"... we need a kind of calculus of redundancy."

I agree whole-heartedly.

What for Shannon was the key to error-correction is thus implicitly
normative. But of course the assessment of normativity (accurate/inaccurate,
useful/useless, significant/insignificant) must necessarily involve an
"outside" perspective, i.e. more than merely the statistics of sign-medium
characteristics. Redundancy is also implicit in concepts like
communication, shared understanding, iconism, and Fano's "mutual
information." But notice too that redundancy is precisely non-information
in a strictly statistical understanding of that concept; a redundant
message is not itself "news" — and yet it can reduce the uncertainty of
what is "message" and what is "noise." It is my intuition that by
developing a formalization (e.g. a "calculus") using the complementary
notions of redundancy and constraint we will ultimately be able to
formulate a route from Shannon to the higher-order conceptions of
information, in which referential and normative features can be precisely
formulated.

There is an open door, though it still seems pretty dark on the other side.
So one must risk stumbling in order to explore that space.

Happy 2017, Terry

On Sat, Jan 7, 2017 at 9:02 AM, John Collier  wrote:

> Dear List,
>
>
>
> I agree with Terry that we should not be bound by our own partial
> theories. We need an integrated view of information that shows its
> relations in all of its various forms. There is a family resemblance in the
> ways it is used, and some sort of taxonomy can be constructed. I recommend
> that of Luciano Floridi. His approach is not unified (unlike my own,
> reported on this list), but compatible with it, and is a place to start,
> though it needs expansion and perhaps modification. There may be some
> unifying concept of information, but its application to all the various
> ways it has been used will not be obvious, and a sufficiently general
> formulation may well seem trivial, especially to those interested in the
> vital communicative and meaningful aspects of information. I also agree
> with Loet that pessimism, however justified, is not the real problem. To
> some extent it is a matter of maturity, which takes both time and
> development, not to mention giving up cherished juvenile enthusiasms.
>
>
>
> I might add that constructivism, with its positivist underpinnings, tends
> to lead to nominalism and relativism about whatever is out there. I believe
> that this is a major hindrance to a unified understanding. I understand
> that it appeared in reaction to an overzealous and simplistic realism about
> science and other areas, but I think it threw the baby out with the
> bathwater.
>
>
>
> I have been really ill, hence my lack of communication. I am pleased to see
> this discussion, which is necessary for the field to develop maturity. I
> thought I should add my bit, and wish everyone a Happy New Year, with all
> its possibilities.
>
>
>
> Warmest regards to everyone,
>
> John
>
>
>
> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet
> Leydesdorff
> Sent: December 31, 2016 12:16 AM
> To: 'Terrence W. DEACON'; 'Dai Griffiths' <dai.griffith...@gmail.com>;
> 'Foundations of Information Science'
>
> Subject: Re: [Fis] What is information? and What is life?
>
>
>
> We agree that such a theory is a ways off, though some of you are far more
> pessimistic about its possibility than I am. I believe that we would do best
> to focus on the hole that needs filling in rather than assuming that it is
> an unfillable given.
>
>
>
> Dear Terrence and colleagues,
>
>
>
> It is not a matter of pessimism. We have the example of “General Systems
> Theory” of the 1930s (von Bertalanffy and others). Only gradually did one
> realize the biological metaphor driving it. In my opinion, we have become
> reflexively skeptical about claims of “generality” because we know the
> statements are framed within paradigms. Translations are needed in this
> fractional manifold.
>
>
>
> I agree that we are moving in a fruitful direction. Your books “Incomplete
> Nature” and “The Symbolic Species” have been important. The failing options
> cannot be observed, but have to be constructed culturally, that is, in
> discourse. It seems to me that we need a kind of calculus of redundancy.
> Perspectives which are reflexively aware of this need and do not assume an
> unproblematic “given” or “natural” are perhaps to be privileged
> nonetheless. The unobservable options have first to be specified and we
> need theory (hypotheses) for this. Perhaps, this epistemological privilege
> can be used as a vantage point.
>
>
>
> There is an interesting relation to Husserl’s *Critique of the European
> Sciences* (1935): The failing (or forgotten) dimension is grounded in
> “intersubjective intentionality.” Nowadays, we would call this “discourse”.
> How are 

Re: [Fis] What is information? and What is life?

2017-01-10 Thread John Collier
Dear List,

I agree with Terry that we should not be bound by our own partial theories. We 
need an integrated view of information that shows its relations in all of its 
various forms. There is a family resemblance in the ways it is used, and some 
sort of taxonomy can be constructed. I recommend that of Luciano Floridi. His 
approach is not unified (unlike my own, reported on this list), but compatible 
with it, and is a place to start, though it needs expansion and perhaps 
modification. There may be some unifying concept of information, but its 
application to all the various ways it has been used will not be obvious, and a 
sufficiently general formulation may well seem trivial, especially to those 
interested in the vital communicative and meaningful aspects of information. I 
also agree with Loet that pessimism, however justified, is not the real 
problem. To some extent it is a matter of maturity, which takes both time and 
development, not to mention giving up cherished juvenile enthusiasms.

I might add that constructivism, with its positivist underpinnings, tends to 
lead to nominalism and relativism about whatever is out there. I believe that 
this is a major hindrance to a unified understanding. I understand that it 
appeared in reaction to an overzealous and simplistic realism about science and 
other areas, but I think it threw the baby out with the bathwater.

I have been really ill, hence my lack of communication. I am pleased to see this 
discussion, which is necessary for the field to develop maturity. I thought I 
should add my bit, and wish everyone a Happy New Year, with all its 
possibilities.

Warmest regards to everyone,
John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: December 31, 2016 12:16 AM
To: 'Terrence W. DEACON'; 'Dai Griffiths'; 'Foundations of Information Science'
Subject: Re: [Fis] What is information? and What is life?

We agree that such a theory is a ways off, though some of you are far more 
pessimistic about its possibility than I am. I believe that we would do best to 
focus on the hole that needs filling in rather than assuming that it is an 
unfillable given.

Dear Terrence and colleagues,

It is not a matter of pessimism. We have the example of “General Systems 
Theory” of the 1930s (von Bertalanffy and others). Only gradually did one 
realize the biological metaphor driving it. In my opinion, we have become 
reflexively skeptical about claims of “generality” because we know the 
statements are framed within paradigms. Translations are needed in this 
fractional manifold.

I agree that we are moving in a fruitful direction. Your books “Incomplete 
Nature” and “The Symbolic Species” have been important. The failing options 
cannot be observed, but have to be constructed culturally, that is, in 
discourse. It seems to me that we need a kind of calculus of redundancy. 
Perspectives which are reflexively aware of this need and do not assume an 
unproblematic “given” or “natural” are perhaps to be privileged nonetheless. 
The unobservable options have first to be specified and we need theory 
(hypotheses) for this. Perhaps, this epistemological privilege can be used as a 
vantage point.

There is an interesting relation to Husserl’s Critique of the European Sciences 
(1935): The failing (or forgotten) dimension is grounded in “intersubjective 
intentionality.” Nowadays, we would call this “discourse”. How are discourses 
structured and how can they be translated for the purpose of offering this 
“foundation”?

Happy New Year,
Loet

My modest suggestion is only that in the absence of a unifying theory we should 
not privilege one partial theory over others and that in the absence of a 
global general theory we need to find terminology that clearly identifies the 
level at which the concept is being used. Lacking this, we end up debating 
incompatible definitions, and defending our favored one that either excludes or 
includes issues of reference and significance or else assumes or denies the 
relevance of human interpreters. With different participants interested in 
different levels and applications of the information concept—from physics, to 
computation, to neuroscience, to biosemiotics, to language, to art, 
etc.—failure to mark this diversity will inevitably lead us in circles.

I urge humility with precision and an eye toward synthesis.

Happy new year to all.

— Terry

On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths wrote:

Thanks Stan,

Yes, it's a powerful and useful process.
My problem is that in this list, and in other places where such matters are 
discussed, we don't seem to be able to agree on the big picture, and the higher 
up the generalisations we go, the less we agree.

I'd like to keep open the possibility that we might be yoking ideas together 
which it may 

Re: [Fis] What is information? and What is life?

2016-12-31 Thread Loet Leydesdorff
We agree that such a theory is a ways off, though some of you are far more 
pessimistic about its possibility than I am. I believe that we would do best to 
focus on the hole that needs filling in rather than assuming that it is an 
unfillable given.

 

Dear Terrence and colleagues, 

 

It is not a matter of pessimism. We have the example of “General Systems 
Theory” of the 1930s (von Bertalanffy and others). Only gradually did one 
realize the biological metaphor driving it. In my opinion, we have become 
reflexively skeptical about claims of “generality” because we know the 
statements are framed within paradigms. Translations are needed in this 
fractional manifold.

 

I agree that we are moving in a fruitful direction. Your books “Incomplete 
Nature” and “The Symbolic Species” have been important. The failing options 
cannot be observed, but have to be constructed culturally, that is, in 
discourse. It seems to me that we need a kind of calculus of redundancy. 
Perspectives which are reflexively aware of this need and do not assume an 
unproblematic “given” or “natural” are perhaps to be privileged nonetheless. 
The unobservable options have first to be specified and we need theory 
(hypotheses) for this. Perhaps, this epistemological privilege can be used as a 
vantage point. 

 

There is an interesting relation to Husserl’s Critique of the European Sciences 
(1935): The failing (or forgotten) dimension is grounded in “intersubjective 
intentionality.” Nowadays, we would call this “discourse”. How are discourses 
structured and how can they be translated for the purpose of offering this 
“foundation”?

 

Happy New Year,

Loet

 

My modest suggestion is only that in the absence of a unifying theory we should 
not privilege one partial theory over others and that in the absence of a 
global general theory we need to find terminology that clearly identifies the 
level at which the concept is being used. Lacking this, we end up debating 
incompatible definitions, and defending our favored one that either excludes or 
includes issues of reference and significance or else assumes or denies the 
relevance of human interpreters. With different participants interested in 
different levels and applications of the information concept—from physics, to 
computation, to neuroscience, to biosemiotics, to language, to art, 
etc.—failure to mark this diversity will inevitably lead us in circles. 

 

I urge humility with precision and an eye toward synthesis.

 

Happy new year to all.

 

— Terry

 

On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths  
wrote:

Thanks Stan,

Yes, it's a powerful and useful process. 

My problem is that in this list, and in other places where such matters are 
discussed, we don't seem to be able to agree on the big picture, and the higher 
up the generalisations we go, the less we agree. 

I'd like to keep open the possibility that we might be yoking ideas together 
which it may be more useful to keep apart. We are dealing with messy concepts 
in messy configurations, which may not always map neatly onto a generalisation 
model. 

Dai





On 22/12/16 16:45, Stanley N Salthe wrote:

Dai --

{phenomenon 1}

{phenomenon 2}   -->  {Phenomena 1 & 2} ---> {phenomena 1,2,3}

{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case

 of generalization which can be mathematized, which in turn allows

 it to be generalized even more.

So, what’s the problem?

STAN

 

On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths  
wrote:

>  Information is not “something out there” which “exists” otherwise than as 
> our construct.

I agree with this. And I wonder to what extent our problems in discussing 
information come from our desire to shoe-horn many different phenomena into the 
same construct. It would be possible to disaggregate the construct. It would be 
possible to discuss the topics which we address on this list without using the 
word 'information'. We could discuss redundancy, variety, constraint, meaning, 
structural coupling, coordination, expectation, language, etc.

In what ways would our explanations be weakened?

In what ways might we gain in clarity? 

If we were to go down this road, we would face the danger that our discussions 
might become (even more) remote from everyday human experience. But many 
scientific discussions are remote from everyday human experience.

Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:

Dear colleagues, 

 

A distribution contains uncertainty that can be measured in terms of bits of 
information.

Alternatively: the expected information content H of a probability distribution 
is H = −Σ_i p_i log2(p_i).

H is further defined as probabilistic entropy using Gibbs's formulation of the 
entropy, S = −k_B Σ_i p_i ln(p_i).
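
Read operationally, the definition can be computed directly; a minimal sketch 
in Python (the distributions are illustrative):

from math import log2

def shannon_entropy(p):
    """Expected information content H = -sum_i p_i * log2(p_i), in bits."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: maximal uncertainty over two options
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a skewed distribution is less uncertain
print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome carries no uncertainty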

 

This definition of information is an operational definition. In my opinion, we 
do not need an essentialistic 

Re: [Fis] What is information? and What is life?

2016-12-30 Thread Terrence W. DEACON
Thank you Francesco for a thoughtful commentary. I think that it is a
wonderful reflection with which to mark the end of this tumultuous year and its
challenging discussions. Because I was moved by your sentiment I crudely
translate your words below. I hope it captures some of the elegance of your
comment, especially the poetic last two lines. Thank you.

Dear Terry, Joseph and All,

Although it is difficult to pursue and achieve a degree of harmony despite
dis-agreements, or to find a concrete logic or practical philosophy that is
"good", "right" and "real",  in order to understand what exists and to
develop practical knowledge, there must be communication between humans
that can lead recursively to a co-ordination of meaning. COMMUNICATION
cannot be separated from information and inevitably has a sort of
economics. So I think that the “use-value” of a shape or form applies in
all fields, including physics, biology, mathematics, music, poetry, art,
sculpture, etc. Thus, a piece of iron is valued less than a nail and a nail
is valued less than a screw; a cell is valued less than a tissue and a
tissue is valued less than an organ and a body is valued less than an
organism; an undifferentiated stem cell (biological currency) is valued
more than a differentiated cell; a musical note or a color is valued less
than a musical score or a picture; a word is valued more than the
individual vowels or consonants and less than a poem; a mathematical symbol
is valued less than an equation or function; a point or a line is valued
less than a geometric figure, etc. All forms must be MEANT, which is why
the science of existence, or the existence of science, is ALWAYS BASED on
the Triad: signification, information, communication. Finally,
dis-equilibrium is vital and the breaking of symmetries or discontinuities
can be creative. So you have to get busy using the elective affinities or
synergies that are born between some of you or us to build, not to destroy,
in order to generalize knowledge. Rather than remove a brick, it is better
to insert one, not to build walls of separation or opposition, but bridges
of communication. There will be others who come after us to bring other
bricks.

On Thu, Dec 29, 2016 at 9:54 PM, Francesco Rizzo <
13francesco.ri...@gmail.com> wrote:

> Dear Terry, Joseph and All,
> even though the harmony of dis-agreement is more difficult to pursue and
> achieve, a concrete logic or practical philosophy can be "beautiful",
> "good", "right" and "true" for understanding the praxis of existence and
> the domain of knowledge, and for carrying out communication between human
> beings as recursive behavioral coordination described semantically.
> COMMUNICATION cannot be separated from INFORMATION (in economics, e.g., I
> use the value of form, or the form of value, which in my view holds in all
> fields: physics, biology, mathematics, music, poetry, art, sculpture,
> etc.): a piece of iron is worth less than a nail and a nail is worth less
> than a screw; a cell is worth less than a tissue, a tissue is worth less
> than an organ, and an organ is worth less than an organism; an
> undifferentiated stem cell (biological currency) is worth more than a
> differentiated cell; a note or a color is worth less than a musical score
> or a painting; a word is worth more than its individual vowels or
> consonants and less than a poem; a mathematical symbol is worth less than
> an equation or a function; a point or a line is worth less than a
> geometric figure, etc. Every form must be SIGNIFIED, which is why the
> science of existence, or the existence of science, is ALWAYS BASED on the
> Triad: signification, information, communication. Finally, dis-equilibrium
> is vital, and the breaking of symmetries or discontinuities is creative.
> So we must get busy using the elective affinities or synergies that have
> arisen among some of You or of Us: to build, not to destroy, going as far
> as one can go in order to generalize knowledge: rather than removing a
> brick, it is better to lay one, not to build walls of separation or
> opposition, but bridges of communication. Those who come after us will
> bring further bricks.
> Francesco
>
> 2016-12-29 23:31 GMT+01:00 Terrence W. DEACON :
>
>> Dear Loet and others,
>>
>> I feel as though we are in search of a common general theory, but from
>> divergent perspectives and expectations. Of course we should not merely
>> assume a common general theory of information if one doesn't yet exist. We
>> agree that such a theory is a ways off, though some of you are far more
>> pessimistic about its possibility than I am. I believe that we would do best
>> to focus on the hole that needs filling in rather than assuming that it is
>> an unfillable given.
>>
>> My modest suggestion is only that in the absence of a unifying theory we
>> should not privilege one partial theory over 

Re: [Fis] What is information? and What is life?

2016-12-29 Thread Francesco Rizzo
Dear Terry, Joseph and All,
even though the harmony of dis-agreement is more difficult to pursue and
achieve, a concrete logic or practical philosophy can be "beautiful",
"good", "right" and "true" for understanding the praxis of existence and
the domain of knowledge, and for carrying out communication between human
beings as recursive behavioral coordination described semantically.
COMMUNICATION cannot be separated from INFORMATION (in economics, e.g., I
use the value of form, or the form of value, which in my view holds in all
fields: physics, biology, mathematics, music, poetry, art, sculpture,
etc.): a piece of iron is worth less than a nail and a nail is worth less
than a screw; a cell is worth less than a tissue, a tissue is worth less
than an organ, and an organ is worth less than an organism; an
undifferentiated stem cell (biological currency) is worth more than a
differentiated cell; a note or a color is worth less than a musical score
or a painting; a word is worth more than its individual vowels or
consonants and less than a poem; a mathematical symbol is worth less than
an equation or a function; a point or a line is worth less than a
geometric figure, etc. Every form must be SIGNIFIED, which is why the
science of existence, or the existence of science, is ALWAYS BASED on the
Triad: signification, information, communication. Finally, dis-equilibrium
is vital, and the breaking of symmetries or discontinuities is creative.
So we must get busy using the elective affinities or synergies that have
arisen among some of You or of Us: to build, not to destroy, going as far
as one can go in order to generalize knowledge: rather than removing a
brick, it is better to lay one, not to build walls of separation or
opposition, but bridges of communication. Those who come after us will
bring further bricks.
Francesco

2016-12-29 23:31 GMT+01:00 Terrence W. DEACON :

> Dear Loet and others,
>
> I feel as though we are in search of a common general theory, but from
> divergent perspectives and expectations. Of course we should not merely
> assume a common general theory of information if one doesn't yet exist. We
> agree that such a theory is a ways off, though some of you are far more
> pessimistic about its possibility than I am. I believe that we would do best
> to focus on the hole that needs filling in rather than assuming that it is
> an unfillable given.
>
> My modest suggestion is only that in the absence of a unifying theory we
> should not privilege one partial theory over others and that in the absence
> of a global general theory we need to find terminology that clearly
> identifies the level at which the concept is being used. Lacking this, we
> end up debating incompatible definitions, and defending our favored one
> that either excludes or includes issues of reference and significance or
> else assumes or denies the relevance of human interpreters. With different
> participants interested in different levels and applications of the
> information concept—from physics, to computation, to neuroscience, to
> biosemiotics, to language, to art, etc.—failure to mark this diversity will
> inevitably lead us in circles.
>
> I urge humility with precision and an eye toward synthesis.
>
> Happy new year to all.
>
> — Terry
>
> On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths  > wrote:
>
>> Thanks Stan,
>>
>> Yes, it's a powerful and useful process.
>> My problem is that in this list, and in other places where such matters
>> are discussed, we don't seem to be able to agree on the big picture, and
>> the higher up the generalisations we go, the less we agree.
>>
>> I'd like to keep open the possibility that we might be yoking ideas
>> together which it may be more useful to keep apart. We are dealing with
>> messy concepts in messy configurations, which may not always map neatly
>> onto a generalisation model.
>>
>> Dai
>>
>>
>>
>> On 22/12/16 16:45, Stanley N Salthe wrote:
>>
>> Dai --
>>
>> {phenomenon 1}
>>
>> {phenomenon 2}   -->  {Phenomena 1 & 2} ---> {phenomena 1,2,3}
>>
>> {phenomenon 3}
>>
>> The process from left to right is generalization.
>>
>> ‘Information’ IS a generalization.
>>
>> Generalities form the substance of philosophy. Info happens to be a case
>>
>>  of generalization which can be mathematized, which in turn allows
>>
>>  it to be generalized even more.
>>
>> So, what’s the problem?
>>
>> STAN
>>
>> On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths wrote:
>>> >  Information is not “something out there” which “exists” otherwise
>>> than as our construct.
>>>
>>> I agree with this. And I wonder to what extent our problems in
>>> discussing information come from our desire to shoe-horn many different
>>> phenomena into the same construct. It would be possible to disaggregate the
>>> construct. It would be possible to 

Re: [Fis] What is information? and What is life?

2016-12-29 Thread Terrence W. DEACON
Dear Loet and others,

I feel as though we are in search of a common general theory, but from
divergent perspectives and expectations. Of course we should not merely
assume a common general theory of information if one doesn't yet exist. We
agree that such a theory is a ways off, though some of you are far more
pessimistic about its possibility than I am. I believe that we would do best
to focus on the hole that needs filling in rather than assuming that it is
an unfillable given.

My modest suggestion is only that in the absence of a unifying theory we
should not privilege one partial theory over others and that in the absence
of a global general theory we need to find terminology that clearly
identifies the level at which the concept is being used. Lacking this, we
end up debating incompatible definitions, and defending our favored one
that either excludes or includes issues of reference and significance or
else assumes or denies the relevance of human interpreters. With different
participants interested in different levels and applications of the
information concept—from physics, to computation, to neuroscience, to
biosemiotics, to language, to art, etc.—failure to mark this diversity will
inevitably lead us in circles.

I urge humility with precision and an eye toward synthesis.

Happy new year to all.

— Terry

On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths 
wrote:

> Thanks Stan,
>
> Yes, it's a powerful and useful process.
> My problem is that in this list, and in other places where such matters are
> discussed, we don't seem to be able to agree on the big picture, and the
> higher up the generalisations we go, the less we agree.
>
> I'd like to keep open the possibility that we might be yoking ideas
> together which it may be more useful to keep apart. We are dealing with
> messy concepts in messy configurations, which may not always map neatly
> onto a generalisation model.
>
> Dai
>
>
>
> On 22/12/16 16:45, Stanley N Salthe wrote:
>
> Dai --
>
> {phenomenon 1}
>
> {phenomenon 2}   -->  {Phenomena 1 & 2} ---> {phenomena 1,2,3}
>
> {phenomenon 3}
>
> The process from left to right is generalization.
>
> ‘Information’ IS a generalization.
>
> Generalities form the substance of philosophy. Info happens to be a case
>
>  of generalization which can be mathematized, which in turn allows
>
>  it to be generalized even more.
>
> So, what’s the problem?
>
> STAN
>
> On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths wrote:
>
>> >  Information is not “something out there” which “exists” otherwise than
>> as our construct.
>>
>> I agree with this. And I wonder to what extent our problems in discussing
>> information come from our desire to shoe-horn many different phenomena into
>> the same construct. It would be possible to disaggregate the construct. It
>> would be possible to discuss the topics which we address on this list without
>> using the word 'information'. We could discuss redundancy, variety,
>> constraint, meaning, structural coupling, coordination, expectation,
>> language, etc.
>>
>> In what ways would our explanations be weakened?
>>
>> In what ways might we gain in clarity?
>>
>> If we were to go down this road, we would face the danger that our
>> discussions might become (even more) remote from everyday human experience.
>> But many scientific discussions are remote from everyday human experience.
>>
>> Dai
>> On 20/12/16 08:26, Loet Leydesdorff wrote:
>>
>> Dear colleagues,
>>
>>
>>
>> A distribution contains uncertainty that can be measured in terms of bits
>> of information.
>>
>> Alternatively: the expected information content H of a probability
>> distribution is H = −Σ_i p_i log2(p_i).
>>
>> H is further defined as probabilistic entropy using Gibbs's formulation
>> of the entropy, S = −k_B Σ_i p_i ln(p_i).
>>
>>
>>
>> This definition of information is an operational definition. In my
>> opinion, we do not need an essentialistic definition by answering the
>> question of “what is information?” As the discussion on this list
>> demonstrates, one does not easily agree on an essential answer; one can
>> answer the question “how is information defined?” Information is not
>> “something out there” which “exists” otherwise than as our construct.
>>
>>
>>
>> Using essentialistic definitions, the discussion tends not to move
>> forward. For example, Stuart Kauffman’s and Bob Logan’s (2007) definition
>> of information “as natural selection assembling the very constraints on the
>> release of energy that then constitutes work and the propagation of
>> organization.” I asked several times what this means and how one can
>> measure this information. Hitherto, I only obtained the answer that
>> colleagues who disagree with me will be cited. ☺ Another answer was that
>> “counting” may lead to populism. ☺
>>
>>
>>
>> Best,
>>
>> Loet
>>
>>
>> --
>>
>> Loet Leydesdorff
>>
>> Professor, University of Amsterdam
>> Amsterdam School of Communication Research (ASCoR)

Re: [Fis] What is information? and What is life?

2016-12-29 Thread Dai Griffiths

Thanks Stan,

Yes, it's a powerful and useful process.

My problem is that in this list, and in other places where such matters 
are discussed, we don't seem to be able to agree on the big picture, and 
the higher up the generalisations we go, the less we agree.


I'd like to keep open the possibility that we might be yoking ideas 
together which it may be more useful to keep apart. We are dealing with 
messy concepts in messy configurations, which may not always map neatly 
onto a generalisation model.


Dai


On 22/12/16 16:45, Stanley N Salthe wrote:


Dai --

{phenomenon 1}

{phenomenon 2}   -->  {Phenomena 1 & 2} ---> {phenomena 1,2,3}

{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case

 of generalization which can be mathematized, which in turn allows

 it to be generalized even more.

So, what’s the problem?

STAN


On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths wrote:


>  Information is not “something out there” which “exists”
otherwise than as our construct.

I agree with this. And I wonder to what extent our problems in
discussing information come from our desire to shoe-horn many
different phenomena into the same construct. It would be possible
to disaggregate the construct. It would be possible to discuss the
topics which we address on this list without using the word
'information'. We could discuss redundancy, variety, constraint,
meaning, structural coupling, coordination, expectation, language,
etc.

In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our
discussions might become (even more) remote from everyday human
experience. But many scientific discussions are remote from
everyday human experience.

Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:


Dear colleagues,

A distribution contains uncertainty that can be measured in terms
of bits of information.

Alternatively: the expected information content H of a
probability distribution is H = −Σ_i p_i log2(p_i).

H is further defined as probabilistic entropy using Gibbs's
formulation of the entropy, S = −k_B Σ_i p_i ln(p_i).

This definition of information is an operational definition. In
my opinion, we do not need an essentialistic definition by
answering the question of “what is information?” As the
discussion on this list demonstrates, one does not easily agree
on an essential answer; one can answer the question “how is
information defined?” Information is not “something out there”
which “exists” otherwise than as our construct.

Using essentialistic definitions, the discussion tends not to
move forward. For example, Stuart Kauffman’s and Bob Logan’s
(2007) definition of information “as natural selection assembling
the very constraints on the release of energy that then
constitutes work and the propagation of organization.” I asked
several times what this means and how one can measure this
information. Hitherto, I only obtained the answer that colleagues
who disagree with me will be cited. ☺ Another answer was that
“counting” may lead to populism. ☺

Best,

Loet



Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ=en


From: Dick Stoute [mailto:dick.sto...@gmail.com]
Sent: Monday, December 19, 2016 12:48 PM
To: l...@leydesdorff.net
Cc: James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
Subject: Re: [Fis] What is information? and What is life?

List,

Please allow me to respond to Loet about the definition of
information stated below.

1. the definition of information as uncertainty is
counter-intuitive ("bizarre"); (p. 27)

I agree.  I struggled with this definition for a long time before
realising that Shannon was really discussing "amount of
information" or the number of bits needed to convey a message. 

[Fis] What is information? and What is life?

2016-12-26 Thread Christophe
Dear Loet,
You nicely illustrate the problem as a “hole“ in the center of the various 
perspectives. All these current and futures perspectives are indeed needed but 
it is true that “a general theory of information” remains terrribly 
challenging, precisely due to the sometimes orthogonal perspectives of the 
different theories, as you say.
Now, perhaps the “hole” can be used as an image leading us far back in time 
when our universe was only about matter and energy. The evolution of our 
universe could then be used as a reference frame for the history of information.
Such a time-guided background can be used for all the various perspectives and 
also highlights pitfalls like the mysterious natures of life and human mind.
This brings us to take life as a starting point for the being of meaningful 
information (as said, information should not be separated from meaning. Weaver 
rightly recommended not to confuse meaning with information. It is not about 
separating them).
So we could begin by positioning our investigations between life and human mind 
to address the natures of information and meaning, which are realities at that 
level and can there be modeled in quite simple terms.
Then, being careful with the human mind, we could go to human management of 
information and consider human achievements and current works: the measurement 
of quantity (channel capacity, Shannon), the formalizations (physical, 
referential, normative, syntactic, semantic, pragmatic, constraint-satisfaction 
oriented, your communication/sharing of meaning or information, ...).
This does not really fill the “hole”, but it brings in evolution as a thread 
which leads us to start with the simplest task.
Wishing you and all FISers the best for this year end and for the coming 2017.
Christophe


From: Fis, on behalf of Loet Leydesdorff
Sent: Monday, December 26, 2016 2:01 PM
To: 'Terrence W. DEACON'; 'Francesco Rizzo'; 'fis'
Subject: Re: [Fis] What is information? and What is life?


In this respect Loet comments:



"In my opinion, the status of Shannon’s mathematical theory of information is 
different from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."



We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon.



Dear Terrence and colleagues,



The inversion is fine with me as an exploration. But I don’t think that this 
can be done on programmatic grounds because of the assumed possibility of “a 
general theory of information”. I don’t think that such a theory exists or is 
even possible without assumptions that beg the question.



In other words, we have a “hole” in the center. Each perspective can claim its 
“generality” or fundamental character. For example, many of us entertain a 
biological a priori; others (including you?) reason on the basis of physics. 
The various (special) theories, however, are not juxtaposed, but can be 
considered as other (sometimes orthogonal) perspectives. Translations are 
possible at the bottom by unpacking in normal language or sometimes more 
formally (and advanced; productive?) using Shannon’s information theory and 
formalizations derived from it.



I admit my own communication-theoretical a priori. I am interested in the 
communication of knowledge as different from the communication of information. 
Discursive knowledge specifies and codifies meaning. The communication/sharing 
of meaning provides an in-between layer, which has also to be distinguished 
from the communication of information. Meaning is not relational but 
positional; it cannot be communicated, but it can be shared. I am currently 
working (with coauthors) on a full paper on the subject. The following is the 
provisional abstract:

As against a monadic reduction of knowledge and meaning to signal processing 
among neurons, we distinguish among information processing, meaning processing, and the 
possible codification of specific meanings as discursive knowledge. Whereas the 
Shannon-type information is coupled to the second law of thermodynamics, 
redundancy—that is, the complement of information to the maximum entropy—can be 
extended by further distinctions and the specification of expectations when new 
options are made feasible. With the opposite sign, the dynamics of knowledge 
production thus infuses the historical (e.g., institutional) dynamics with a 
cultural evolution. Meaning is provided from the perspective of hindsight as 
feedback on the entropy flow. The circling among dynamics in feedback and 
feedforward loops can be evaluated by the sign of mutual information. When 
mutual redundancy prevails, the resulting sign is negative, indicating that more 
options are made available and innovation can be expected to flourish. The 
relation of this cultural evolution with the computation of 

Re: [Fis] What is information? and What is life?

2016-12-26 Thread Loet Leydesdorff
In this respect Loet comments:

 

"In my opinion, the status of Shannon’s mathematical theory of information is 
different from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."

 

We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon. 

 

Dear Terrence and colleagues, 

 

The inversion is fine with me as an exploration. But I don’t think that this 
can be done on programmatic grounds because of the assumed possibility of “a 
general theory of information”. I don’t think that such a theory exists or is 
even possible without assumptions that beg the question. 

 

In other words, we have a “hole” in the center. Each perspective can claim its 
“generality” or fundamental character. For example, many of us entertain a 
biological a priori; others (including you?) reason on the basis of physics. 
The various (special) theories, however, are not juxtaposed, but can be 
considered as other (sometimes orthogonal) perspectives. Translations are 
possible at the bottom by unpacking in normal language or sometimes more 
formally (and advanced; productive?) using Shannon’s information theory and 
formalizations derived from it.

 

I admit my own communication-theoretical a priori. I am interested in the 
communication of knowledge as different from the communication of information. 
Discursive knowledge specifies and codifies meaning. The communication/sharing 
of meaning provides an in-between layer, which has also to be distinguished 
from the communication of information. Meaning is not relational but 
positional; it cannot be communicated, but it can be shared. I am currently 
working (with coauthors) on a full paper on the subject. The following is the 
provisional abstract: 

As against a monadic reduction of knowledge and meaning to signal processing 
among neurons, we distinguish among information processing, meaning processing, and the 
possible codification of specific meanings as discursive knowledge. Whereas the 
Shannon-type information is coupled to the second law of thermodynamics, 
redundancy—that is, the complement of information to the maximum entropy—can be 
extended by further distinctions and the specification of expectations when new 
options are made feasible. With the opposite sign, the dynamics of knowledge 
production thus infuses the historical (e.g., institutional) dynamics with a 
cultural evolution. Meaning is provided from the perspective of hindsight as 
feedback on the entropy flow. The circling among dynamics in feedback and 
feedforward loops can be evaluated by the sign of mutual information. When 
mutual redundancy prevails, the resulting sign is negative, indicating that more 
options are made available and innovation can be expected to flourish. The 
relation of this cultural evolution with the computation of anticipatory 
systems can be specified; but the resulting puzzles are a subject for future 
research.

Best,

Loet

 

Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ=en

 

 

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What is information? and What is life?

2016-12-24 Thread Terrence W. DEACON
Dear colleagues,


I am entirely in agreement with the sentiments about mutual respect that
Loet recommends and the "harmony of knowledge" that Francesco promotes. But
I believe that this must also include a willingness to recognize that there
isn't a most basic theory; only what we might characterize as a currently
most thoroughly worked out analysis. But this is an analysis at the most
stripped down level—and which therefore necessarily ignores much that is
essential to a fuller analysis of information.


In this respect Loet comments:


"In my opinion, the status of Shannon’s mathematical theory of information
is different from special theories of information (e.g., biological ones)
since the formal theory enables us to translate between these latter
theories."


We are essentially in agreement, and yet I would invert any perspective
that prioritizes the approach pioneered by Shannon. This analysis of the
signal properties that are necessary for conveying information does not
attempt to address the "higher order" properties that we pay attention to
in domains where reference and functional value are relevant (e.g. biology,
neuroscience, sociology, art). It necessarily brackets these aspects from
consideration. It thereby provides a common necessary but not sufficient tool
of analysis. More than a half century of development along these lines has
demonstrated that there are critical features of the information
relationship that cannot be reduced to intrinsic signal properties.


I have argued that there are basically two higher-order general properties
that constitute information: the referential relation and the
normative/functional value relation (with the term 'meaning' often used
somewhat ambiguously to refer to one or both of these properties). I do not
assume that these completely characterize all higher-order properties, and
so I would be open to discussing additional general attributes that fall
outside these domains, and which we need to also consider.


So I am not a fan of prioritizing the statistical conception of information
and considering all others to be "special" theories.


My hope for the field is that we will continue to work toward formalization
of these higher-order properties with the aim of embedding our current
"signal property analysis" within this larger theory. In this respect, I
would argue that the "mathematical theory" as currently developed is in
fact a "special theory," restricted to analyses where reference and
functional significance can be set aside (as in engineering applications),
and that the "general theory" remains to be formulated.


Since its inception, it has been recognized that the "mathematical theory
of communication" has used the term 'information' in a highly atypical
sense. I think that we would do well to keep this historical "accident" in
mind in order to avoid "information fundamentalism." This demands a sort of
humility in the face of the enormity of the challenge before us, not merely
a tolerance of "special" domains of application that don't completely
reduce to statistical analysis.


My proposal is that agreeing on terminological distinctions that support
such a paradigm inversion might provide a first step toward theoretical
convergence toward a "general theory" of information. I would welcome such
a discussion in the new year.


Happy holidays to all, Terry

On Sat, Dec 24, 2016 at 2:22 AM, Francesco Rizzo <
13francesco.ri...@gmail.com> wrote:

> Dear All,
> I have written the same things several times, which is why I agree with You,
> especially with the most recent contributors. And given that I am a foreigner
> to Your disciplines, but not a stranger to the harmony of knowledge or the
> knowledge of harmony, this is a fine thing. Best wishes for a merry Christmas
> and for the new year.
> Francesco
>
> 2016-12-24 7:45 GMT+01:00 Loet Leydesdorff :
>
>> Dear Terrence and colleagues,
>>
>>
>>
>> I agree that we should not be fundamentalistic about “information”. For
>> example, one can also use “uncertainty” as an alternative word to
>> Shannon-type “information”. One can also make distinctions other than
>> semantic/syntactic/pragmatic, such as biological information, etc.
>>
>>
>>
>> Nevertheless, what makes this list a common platform, in my opinion,
>> is our interest in the differences and similarities in the background of
>> these different notions of information. In my opinion, the status of
>> Shannon’s mathematical theory of information is different  from special
>> theories of information (e.g., biological ones) since the formal theory
>> enables us to translate between these latter theories. The translations are
>> heuristically important: they enable us to import metaphors from other
>> backgrounds (e.g., auto-catalysis).
>>
>>
>>
>> For example, one of us communicated with me why I was completely wrong,
>> and made the argument with reference to Kullback-Leibler divergence between
>> two probability distributions. Since 

Re: [Fis] What is information? and What is life?

2016-12-23 Thread Loet Leydesdorff
Dear Terrence and colleagues, 

 

I agree that we should not be fundamentalistic about “information”. For 
example, one can also use “uncertainty” as an alternative word to Shannon-type 
“information”. One can also make distinctions other than 
semantic/syntactic/pragmatic, such as biological information, etc.

 

Nevertheless, what makes this list a common platform, in my opinion, is our
interest in the differences and similarities in the background of these
different notions of information. In my opinion, the status of Shannon’s
mathematical theory of information is different from special theories of
information (e.g., biological ones), since the formal theory enables us to
translate between these latter theories. The translations are heuristically
important: they enable us to import metaphors from other backgrounds (e.g.,
auto-catalysis).

 

For example, one of us wrote to me explaining why I was completely wrong, and
made the argument with reference to the Kullback-Leibler divergence between
two probability distributions. Since we probably will not have “a general
theory” of information, the apparatus in which information is formally and
operationally defined—Bar-Hillel once called it “information calculus”—can
carry this interdisciplinary function with precision and rigor. Otherwise, we
can only be respectful of each other’s research traditions. :-)
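
For instance, a minimal sketch in Python of the kind of calculation involved,
with two illustrative distributions over three outcomes (the numbers are
assumptions for the example only):

    import math

    def kl_divergence(q, p, base=2.0):
        # Kullback-Leibler divergence D(q || p): the information, in bits
        # (for base 2), generated when the a priori distribution p is
        # replaced by the a posteriori distribution q.
        return sum(qi * math.log(qi / pi, base)
                   for qi, pi in zip(q, p) if qi > 0)

    p = [1/3, 1/3, 1/3]    # a priori: uniform over three outcomes
    q = [0.5, 0.25, 0.25]  # a posteriori: updated after an observation
    print(round(kl_divergence(q, p), 3))  # 0.085 bits

The point is not the particular numbers but that the quantity is operationally
defined: any two research traditions that can state their distributions can
compute and compare it.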

 

I wish you all a splendid 2017,

Loet   

 

--
Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;
http://scholar.google.com/citations?user=ych9gNYJ=en

 

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Terrence W. DEACON
Sent: Thursday, December 22, 2016 5:33 AM
To: fis
Subject: Re: [Fis] What is information? and What is life?

 

Against information fundamentalism

 

Rather than fighting over THE definition of information, I suggest that we
stand back from the polemics for a moment and recognize that the term is being
used in often quite incompatible ways in different domains, and that there may
be value in paying attention to the advantages and costs of each. To ignore
these differences, to fail to explore the links and dependencies between them,
and to be indifferent to the use values gained or sacrificed by each is, I
believe, to undermine the very enterprise we claim to be promoting.

 

We currently lack broadly accepted terms to unambiguously distinguish these 
divergent uses and, even worse, we lack a theoretical framework for 
understanding their relationships to one another.

So provisionally I would argue that we at least need to distinguish three 
hierarchically related uses of the concept:

 

1. Physical information: Information as intrinsically measurable medium 
properties with respect to their capacity to support 2 or 3 irrespective of any 
specific instantiation of 2 or 3.

 

2. Referential information: Information as a non-intrinsic relation to 
something other than medium properties (1) that a given medium can provide 
(i.e. reference or content) irrespective of any specific instantiation of 3.

 

3. Normative information: Information as the use value provided by a given 
referential relation (2) with respect to an end-directed dynamic that is 
susceptible to contextual factors that are not directly accessible (i.e. 
functional value or significance).

 

Unfortunately, because the same term has historically been used, unmodified,
in each relevant domain irrespective of the others, there are often pointless
arguments of a purely definitional nature.

 

In linguistic theory an analogous three-part hierarchic partitioning of theory 
IS widely accepted. 

 

1. syntax

2. semantics

3. pragmatics

 

Thus by analogy some have proposed the distinction between

 

1. syntactic information (aka Shannon)

2. semantic information (aka meaning)

3. pragmatic information (aka useful information)

 

This has also often been applied to the philosophy of information (e.g. see
the Stanford Encyclopedia of Philosophy entry for ‘information’).
Unfortunately, the language-centric framing of this distinction can be
somewhat misleading. The metaphoric extension of the terms ‘syntax’ and
‘semantics’ to iconic (e.g. pictorial) or indexical (e.g. correlational) forms
of communication exerts a subtle Procrustean influence that obscures their
naturalistic and nondigital features. This language bias is also often
introduced with 

Re: [Fis] What is information? and What is life?

2016-12-22 Thread Dai Griffiths
>  Information is not “something out there” which “exists” otherwise 
than as our construct.


I agree with this. And I wonder to what extent our problems in 
discussing information come from our desire to shoe-horn many different 
phenomena into the same construct. It would be possible to disaggregate 
the construct. It would be possible to discuss the topics which we address on 
this list without using the word 'information'. We could discuss 
redundancy, variety, constraint, meaning, structural coupling, 
coordination, expectation, language, etc.


In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our 
discussions might become (even more) remote from everyday human 
experience. But many scientific discussions are remote from everyday 
human experience.


Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:


Dear colleagues,

A distribution contains uncertainty that can be measured in terms of 
bits of information.


Alternatively: the expected information content H of a probability
distribution is $H = -\sum_i p_i \log_2 p_i$.


H is further defined as probabilistic entropy using the Gibbs
formulation of the entropy, $S = -k_B \sum_i p_i \ln p_i$.


This definition of information is an operational definition. In my 
opinion, we do not need an essentialistic definition answering the 
question “what is information?” As the discussion on this list 
demonstrates, one does not easily agree on an essential answer; one 
can answer the question “how is information defined?” Information is 
not “something out there” which “exists” otherwise than as our construct.
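
A minimal numerical sketch of this operational definition in Python (the
example distributions are assumptions, chosen only for illustration):

    import math

    def shannon_entropy(p, base=2.0):
        # Expected information content H = -sum_i p_i log p_i,
        # in bits when base = 2.
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: maximal uncertainty over two outcomes
    print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits: a skewed distribution is less uncertain
    print(shannon_entropy([0.25] * 4))  # 2.0 bits: four equally likely outcomes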


Using essentialistic definitions, the discussion tends not to move 
forward. For example, Stuart Kauffman’s and Bob Logan’s (2007) 
definition of information “as natural selection assembling the very 
constraints on the release of energy that then constitutes work and 
the propagation of organization.” I asked several times what this 
means and how one can measure this information. Hitherto, I only 
obtained the answer that colleagues who disagree with me will be 
cited. :-) Another answer was that “counting” may lead to populism. :-)


Best,

Loet



Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;
http://scholar.google.com/citations?user=ych9gNYJ=en


Re: [Fis] What is information? and What is life?

2016-12-19 Thread Bob Logan
Dear Dick - I loved your analysis. You are right on the money. It also explains
why Shannon dominated the field of information. He had a mathematical formula,
and there is nothing more appealing to a scientist than a mathematical formula.
But you are right: his formula only tells us how many bits are needed to
represent some information; it tells us nothing about its meaning or its
significance. As Marshall McLuhan said of Shannon information, it is figure
without ground. A figure only acquires meaning when one understands the ground
in which it operates. So Shannon’s contribution to engineering is excellent,
but it tells us nothing about the nature or impact of the information itself,
as you wisely pointed out. Thanks for your insight.

I would like to refer to your insight the next time I write about info and want
to attribute you correctly. Can you tell me a bit about yourself, such as where
you do your research? Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications



Re: [Fis] What is information? and What is life?

2016-12-19 Thread Dick Stoute
List,

Please allow me to respond to Loet about the definition of information
stated below.

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)



I agree.  I struggled with this definition for a long time before realising
that Shannon was really discussing "amount of information", the number of
bits needed to convey a message.  He was looking for a formula that would
provide an accurate estimate of that number of bits, and he realised that it
depended on the "amount" of uncertainty that had to be eliminated, and so he
equated these.


It makes sense to do this, but we must distinguish between "amount of
information" and "information".  For example, we can measure amount of
water in liters, but this does not tell us what water is and likewise the
measure we use for "amount of information" does not tell us what
information is. We can, for example, equate the amount of water needed to
fill a container with the volume of the container, but we should not think
that water is therefore identical to an empty volume.  Similarly we should
not think that information is identical to uncertainty.


By equating the number of bits needed to convey a message with the "amount
of uncertainty" that has to be eliminated Shannon, in effect, equated
opposites so that he could get an estimate of the number of bits needed to
eliminate the uncertainty.  We should not therefore consider that this
equation establishes what information is.
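
To make the equation concrete, a minimal sketch (assuming ideal code lengths):

    import math

    def bits_needed(p):
        # Ideal code length log2(1/p): the number of bits needed to convey
        # a message that occurs with probability p, i.e. the uncertainty
        # (in bits) that receiving it eliminates.
        return math.log(1 / p, 2)

    print(bits_needed(0.5))  # 1.0 bit: a fair coin flip
    print(bits_needed(1/8))  # 3.0 bits: one of eight equally likely messages
    print(bits_needed(1.0))  # 0.0 bits: nothing to eliminate, nothing to convey

The formula measures the amount; it stays silent on what the message is about.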


Dick




-- 

4 Austin Dr. Prior Park St. James, Barbados 

Re: [Fis] What is information? and What is life?

2016-12-18 Thread Loet Leydesdorff
Dear James and colleagues, 

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s
contribution:

 

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8) 

 

The definition of information as relevant for a system of reference confuses
information with "meaningful information" and thus sacrifices the surplus
value of Shannon's counter-intuitive definition.

 

Lerner’s "information observer" integrates interactive processes such as:
physical interactions (such as photons stimulating the retina of the eye),
human-machine interactions (this is the level that Shannon lives on),
biological interactions (such as body temperature relative to touching an ice
or heat source), social interactions (such as this forum started by Pedro),
economic interactions (such as the stock market), ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure
meaningful information. In a previous series of communications we discussed
redundancy from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between
a priory entropy [sic] and a posteriori entropy), which is distinguished from
the notion of relative information Iap (Lerner, page 7).

 

$I = \sum_i q_i \log_2 (q_i/p_i)$ expresses in bits of information the
information generated when the a priori distribution p is turned into the a
posteriori one q. This follows within the Shannon framework without needing an
observer. I use this equation, for example, in my 1995 book The Challenge of
Scientometrics (Chapters 8 and 9), with a reference to Theil (1972). The
relative information is defined as H/H(max).
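
A small sketch of this relative information, with an assumed three-category
distribution for illustration:

    import math

    def shannon_entropy(p):
        # H = -sum_i p_i log2 p_i, in bits
        return -sum(pi * math.log(pi, 2) for pi in p if pi > 0)

    def relative_information(p):
        # H / H(max), where H(max) = log2(n) for n categories
        return shannon_entropy(p) / math.log(len(p), 2)

    p = [0.7, 0.2, 0.1]
    print(round(relative_information(p), 3))      # 0.730
    print(round(1 - relative_information(p), 3))  # 0.270, the redundancy

The complement 1 - H/H(max) is the redundancy discussed earlier in this thread.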

 

I agree that the intuitive notion of information is derived from the Latin
"in-formare" (Varela, 1979). But most of us no longer use "force" and "mass"
in the intuitive (Aristotelian) sense. :-) The proliferation of meanings when
information is confused with "meaningful information" is indicative of an
"index sui et falsi", in my opinion. The repetitive discussion hampers the
progress of this list. It is "like asking whether a glass is half empty or
half full" (Hayles, 1990, p. 59).

 

This act of forming an information process results in the
construction of an observer that is the owner [holder] of information.

 

The system of reference is then no longer the message, but the observer who
provides meaning to the information (uncertainty). I agree that this is a
selection process, but the variation first has to be specified independently
(before it can be selected).

 

And Lerner introduces the threshold between objective and subjective
observers (page 27). This leads to a consideration of selection and
cooperation that includes entanglement.

 

I don't see a direct relation between information and entanglement. An
observer can be entangled.

 

Best, 

Loet

 

PS. Pedro: Let me assume that this is my second posting in the week which
ends tonight. :-(

 

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis