Re: [Fis] What is information? and What is life?

2017-01-11 Thread Christophe
Dear Terry,
Are you really sure that linking Shannon to higher-order conceptions of 
information, like meaning, is a realistic ambition?
I compare that to linking the width of a street to the individual motivations 
of the persons who will walk in it.
As we know, Shannon measures the capacity of a communication channel. It says 
nothing about the possible meanings of the information that may transit through 
that channel.
Information goes through a communication channel because agents want to 
communicate, to exchange meaningful information (the 'outside perspective', as 
you say). And meanings do not exist by themselves. Meaningful information is 
generated by agents that have reasons to generate it. Animals manage meanings in 
order to stay alive (as individuals and as species). Human motivations and 
constraints are more complex, but they are the sources of our meaning generation.
We agree that information is not to be confused with meaning. However, from a 
pragmatic standpoint the two cannot be separated. But this does not imply, I 
feel, that Shannon is to be linked to the meaning of information.
For me the core of the subject is meaning generation. Why and how is 
meaningful information generated? (https://philpapers.org/rec/MENCOI)

All the best to all for 2017.
Christophe



Re: [Fis] What is information? and What is life?; towards a calculus of redundancy

2017-01-10 Thread Loet Leydesdorff
Toward a Calculus of Redundancy: The feedback arrow of expectations in 
knowledge-based systems

Loet Leydesdorff, Mark W. Johnson, Inga Ivanova

(Submitted on 10 Jan 2017; https://arxiv.org/abs/1701.02455)

 

Whereas the generation of Shannon-type information is coupled to the second law 
of thermodynamics, redundancy--that is, the complement of information to the 
maximum entropy--can be increased by further distinctions: new options can 
discursively be generated. The dynamics of discursive knowledge production thus 
infuse the historical dynamics with a cultural evolution based on expectations 
(as different from observations). We distinguish among (i) the communication of 
information, (ii) the sharing of meaning, and (iii) discursive knowledge. 
Meaning is provided from the perspective of hindsight as feedback on the 
entropy flow and thus generates redundancy. Specific meanings can selectively 
be codified as discursive knowledge; knowledge-based reconstructions enable us 
to specify expectations about future states which can be invoked in the 
present. The cycling among the dynamics of information, meaning, and knowledge 
in feedback and feedforward loops can be evaluated empirically: when mutual 
redundancy prevails over mutual information, the sign of the resulting 
information is negative, indicating a reduction of uncertainty because of new 
options available for realization; innovation can then be expected to flourish. 
When historical realizations prevail, innovation may be locked in because of 
insufficient options for further development. 
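
As a minimal numerical sketch of the quantities named in this abstract 
(Shannon-type information, the maximum entropy, and redundancy as the 
complement of information to the maximum entropy), one might write the 
following in Python; the distribution is illustrative only:

import math

def shannon_entropy(p):
    # Shannon-type information H = -sum(p_i * log2(p_i)), in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]   # a distribution over four options

h = shannon_entropy(p)          # realized uncertainty: 1.75 bits
h_max = math.log2(len(p))       # maximum entropy for four options: 2.0 bits
redundancy = h_max - h          # complement of information to the maximum entropy: 0.25 bits

# A further distinction (a fifth option) raises the maximum entropy and can
# therefore increase the redundancy, as the abstract describes:
p5 = [0.5, 0.25, 0.125, 0.0625, 0.0625]
print(redundancy, math.log2(len(p5)) - shannon_entropy(p5))   # 0.25  ~0.45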

 

* Comments are very welcome at this stage

--

Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net ; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;
http://scholar.google.com/citations?user=ych9gNYJ=en

 


Re: [Fis] What is information? and What is life?

2017-01-10 Thread Terrence W. DEACON
Loet remarks:

"... we need a kind of calculus of redundancy."

I agree whole-heartedly.

What for Shannon was the key to error-correction is thus implicitly
normative. But of course assessment of normativity (accurate/inaccurate,
useful/useless, significant/insignificant) must necessarily involve an
"outside" perspective, i.e. more than merely the statistics of sign medium
characteristics. Redundancy is also implicit in concepts like
communication, shared understanding, iconism, and Fano's "mutual
information." But notice too that redundancy is precisely non-information
in a strictly statistical understanding of that concept; a redundant
message is not itself "news" — and yet it can reduce the uncertainty of
what is "message" and what is "noise." It is my intuition that by
developing a formalization (e.g. a "calculus") using the complementary
notions of redundancy and constraint we will ultimately be able to
formulate a route from Shannon to the higher-order conceptions of
information, in which referential and normative features can be precisely
formulated.

There is an open door, though it still seems pretty dark on the other side.
So one must risk stumbling in order to explore that space.

Happy 2017, Terry


Re: [Fis] What is information? and What is life?

2017-01-10 Thread John Collier
Dear List,

I agree with Terry that we should not be bound by our own partial theories. We 
need an integrated view of information that shows its relations in all of its 
various forms. There is a family resemblance in the ways it is used, and some 
sort of taxonomy can be constructed. I recommend that of Luciano Floridi. His 
approach is not unified (unlike my own, reported on this list), but it is 
compatible with mine, and is a place to start, though it needs expansion and 
perhaps modification. There may be some unifying concept of information, but its 
application to all the various ways it has been used will not be obvious, and a 
sufficiently general formulation may well seem trivial, especially to those 
interested in the vital communicative and meaningful aspects of information. I 
also agree with Loet that pessimism, however justified, is not the real 
problem. To some extent it is a matter of maturity, which takes both time and 
development, not to mention giving up cherished juvenile enthusiasms.

I might add that constructivism, with its positivist underpinnings, tends to 
lead to nominalism and relativism about whatever is out there. I believe that 
this is a major hindrance to a unified understanding. I understand that it 
appeared in reaction to an overzealous and simplistic realism about science and 
other areas, but I think it threw the baby out with the bathwater.

I have been really ill, hence my lack of communication. I am pleased to see this 
discussion, which is necessary for the field to develop maturity. I thought I 
should add my bit, and wish everyone a Happy New Year, with all its 
possibilities.

Warmest regards to everyone,
John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: December 31, 2016 12:16 AM
To: 'Terrence W. DEACON' <dea...@berkeley.edu>; 'Dai Griffiths' 
<dai.griffith...@gmail.com>; 'Foundations of Information Science Information 
Science' <fis@listas.unizar.es>
Subject: Re: [Fis] What is information? and What is life?

We agree that such a theory is a ways off, though some are far more pessimistic 
about its possibility than I am. I believe that we would do best to focus on 
the hole that needs filling in rather than assuming that it is an unfillable 
given.

Dear Terrence and colleagues,

It is not a matter of pessimism. We have the example of “General Systems 
Theory” of the 1930s (von Bertalanffy and others). Only gradually did one 
realize the biological metaphor driving it. In my opinion, we have become 
reflexively skeptical about claims of “generality” because we know the 
statements are framed within paradigms. Translations are needed in this 
fractional manifold.

I agree that we are moving in a fruitful direction. Your books “Incomplete 
Nature” and “The Symbolic Species” have been important. The failing options 
cannot be observed, but have to be constructed culturally, that is, in 
discourse. It seems to me that we need a kind of calculus of redundancy. 
Perspectives which are reflexively aware of this need and do not assume an 
unproblematic “given” or “natural” are perhaps to be privileged nonetheless. 
The unobservable options have first to be specified, and we need theory 
(hypotheses) for this. Perhaps this epistemological privilege can be used as a 
vantage point.

There is an interesting relation to Husserl’s Crisis of the European Sciences 
(1935): the failing (or forgotten) dimension is grounded in “intersubjective 
intentionality.” Nowadays, we would call this “discourse”. How are discourses 
structured, and how can they be translated for the purpose of offering this 
“foundation”?

Happy New Year,
Loet

My modest suggestion is only that in the absence of a unifying theory we should 
not privilege one partial theory over others and that in the absence of a 
global general theory we need to find terminology that clearly identifies the 
level at which the concept is being used. Lacking this, we end up debating 
incompatible definitions, and defending our favored one that either excludes or 
includes issues of reference and significance or else assumes or denies the 
relevance of human interpreters. With different participants interested in 
different levels and applications of the information concept—from physics, to 
computation, to neuroscience, to biosemiotics, to language, to art, 
etc.—failure to mark this diversity will inevitably lead us in circles.

I urge humility with precision and an eye toward synthesis.

Happy new year to all.

— Terry


Re: [Fis] What is information? and What is life?

2016-12-31 Thread Loet Leydesdorff
In my opinion, we do not need an essentialistic definition by answering the 
question of “what is information?” As the discussion on this list demonstrates, 
one does not easily agree on an essential answer; one can answer the question 
“how is information defined?” Information is not “something out there” which 
“exists” otherwise than as our construct.

Using essentialistic definitions, the discussion tends not to move forward. For 
example, Stuart Kauffman’s and Bob Logan’s (2007) definition of information “as 
natural selection assembling the very constraints on the release of energy that 
then constitutes work and the propagation of organization.” I asked several 
times what this means and how one can measure this information. Hitherto, I 
only obtained the answer that colleagues who disagree with me will be cited. :-) 
Another answer was that “counting” may lead to populism. :-)

 

Best,

Loet

 



 

From: Dick Stoute [mailto:dick.sto...@gmail.com] 
Sent: Monday, December 19, 2016 12:48 PM
To: l...@leydesdorff.net
Cc: James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
Subject: Re: [Fis] What is information? and What is life?

 

List,

 

Please allow me to respond to Loet about the definition of information stated 
below.  

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

 

I agree. I struggled with this definition for a long time before realising 
that Shannon was really discussing "amount of information", or the number of 
bits needed to convey a message. He was looking for a formula that would 
provide an accurate estimate of the number of bits needed to convey a message, 
and realised that this number depended on the "amount" of uncertainty that had 
to be eliminated, and so he equated the two.

It makes sense to do this, but we must distinguish between "amount of 
information" and "information". For example, we can measure an amount of water 
in liters, but this does not tell us what water is; likewise, the measure we 
use for "amount of information" does not tell us what information is. We can, 
for example, equate the amount of water needed to fill a container with the 
volume of the container, but we should not think that water is therefore 
identical to an empty volume. Similarly, we should not think that information 
is identical to uncertainty.

By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated, Shannon, in effect, equated opposites 
so that he could get an estimate of the number of bits needed to eliminate the 
uncertainty. We should not therefore consider that this equation establishes 
what information is.
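
A minimal sketch of this point, in Python (illustrative numbers; with N 
equally likely messages the number of bits needed is log2(N), and with unequal 
probabilities the expected number of bits per message is Shannon's H):

import math

# With N equally likely alternative messages, eliminating the uncertainty
# about which one was sent requires log2(N) bits.
n_messages = 8
bits_needed = math.log2(n_messages)          # 3.0 bits

# With unequal probabilities, the expected number of bits per message is
# Shannon's H: a measure of the amount of information, not of what
# information "is".
p = [0.5, 0.25, 0.25]
h = -sum(pi * math.log2(pi) for pi in p)     # 1.5 bits on average

print(bits_needed, h)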

 

Dick

 

 

On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net> wrote:

Dear James and colleagues, 

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8) 

 

The definition of information as relevant for a system of reference confuses 
information with "meaningful information" and thus sacrifices the surplus value 
of Shannon's counter-intuitive definition.

 

Lerner’s “information observer” integrates interactive processes such as: 
physical interactions, such as photons stimulating the retina of the eye; 
human-machine interactions (this is the level that Shannon lives on); 
biological interactions, such as body temperature relative to touching an ice 
or heat source; social interactions, such as this forum started by Pedro; 
economic interactions, such as the stock market; ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure meaningful 
information. In a previous series of communications we discussed redundancy 
from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between 
the a priori entropy and the a posteriori entropy), which is distinguished from 
the notion of relative information Iap (Lerner, page 7). The latter expresses, 
in bits of information, the information generated when the a priori 
distribution is turned into the a posteriori one.
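
Lerner's own formalism is not reproduced here; as a minimal sketch of the two 
quantities as glossed above (an entropy difference, and relative information 
in its standard Kullback-Leibler form; the distributions are illustrative):

import math

def entropy(p):
    # Shannon entropy in bits: H = -sum(p_i * log2(p_i))
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_information(posterior, prior):
    # Kullback-Leibler form: I = sum(q_i * log2(q_i / p_i)), in bits
    return sum(q * math.log2(q / p) for q, p in zip(posterior, prior) if q > 0)

prior = [0.25, 0.25, 0.25, 0.25]   # a priori distribution (illustrative)
posterior = [0.7, 0.1, 0.1, 0.1]   # a posteriori distribution (illustrative)

# Difference between the a priori and the a posteriori entropy (cf. E[Sap] as glossed):
print(entropy(prior) - entropy(posterior))      # ~0.64 bits

# Information generated when the prior is turned into the posterior (cf. Iap):
print(relative_information(posterior, prior))   # ~0.64 bits

# The two coincide here only because the prior is uniform; in general they differ.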




Re: [Fis] What is information? and What is life?

2016-12-29 Thread Dai Griffiths

Thanks Stan,

Yes, it's a powerful and useful process.

My problem is that in this list, and in other places were such matters 
are discussed, we don't seem to be able to agree on the big picture, and 
the higher up the generalisations we go, the less we agree.


I'd like to keep open the possibility that we might be yoking ideas 
together which it may be more useful to keep apart. We are dealing with 
messy concepts in messy configurations, which may not always map neatly 
onto a generalisation model.


Dai


On 22/12/16 16:45, Stanley N Salthe wrote:


Dai --

{phenomenon 1}
{phenomenon 2}  -->  {phenomena 1 & 2}  -->  {phenomena 1, 2, 3}
{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case 
of generalization which can be mathematized, which in turn allows it to 
be generalized even more.

So, what’s the problem?

STAN



[Fis] What is information? and What is life?

2016-12-26 Thread Christophe
Dear Loet,
You nicely illustrate the problem as a “hole” in the center of the various 
perspectives. All these current and future perspectives are indeed needed, but 
it is true that “a general theory of information” remains terribly 
challenging, precisely because of the sometimes orthogonal perspectives of the 
different theories, as you say.
Now, perhaps the “hole” can be used as an image leading us far back in time, 
when our universe was only about matter and energy. The evolution of our 
universe could then be used as a reference frame for the history of information.
Such a time-guided background can be used for all the various perspectives and 
also highlights pitfalls like the mysterious natures of life and the human mind.
This brings us to take life as a starting point for the being of meaningful 
information (as said, information should not be separated from meaning. Weaver 
rightly recommended not to confuse meaning with information; that is not about 
separating them).
So we could begin by positioning our investigations between life and the human 
mind to address the natures of information and meaning, which are realities at 
that level and can there be modeled in quite simple terms.
Then, being careful with the human mind, we could go to human management of 
information and consider human achievements and current works: the measurement 
of quantity (channel capacity, Shannon), the formalizations (physical, 
referential, normative, syntactic, semantic, pragmatic, constraint-satisfaction 
oriented, your communication/sharing of meaning or information, ...).
This does not really fill the “hole”, but it brings in evolution as a thread 
which leads us to start with the simplest task.
Wishing you and all FISers the best for this year end and for the coming 2017.
Christophe



Re: [Fis] What is information? and What is life?

2016-12-26 Thread Loet Leydesdorff
In this respect Loet comments:

"In my opinion, the status of Shannon’s mathematical theory of information is 
different from special theories of information (e.g., biological ones) since 
the formal theory enables us to translate between these latter theories."

We are essentially in agreement, and yet I would invert any perspective that 
prioritizes the approach pioneered by Shannon. 

 

Dear Terrence and colleagues, 

 

The inversion is fine with me as an exploration. But I don’t think that this 
can be done on programmatic grounds because of the assumed possibility of “a 
general theory of information”. I don’t think that such a theory exists or is 
even possible without assumptions that beg the question. 

 

In other words, we have a “hole” in the center. Each perspective can claim its 
“generality” or fundamental character. For example, many of us entertain a 
biological a priori; others (including you?) reason on the basis of physics. 
The various (special) theories, however, are not juxtaposed, but can be 
considered as other (sometimes orthogonal) perspectives. Translations are 
possible at the bottom by unpacking in normal language or, sometimes more 
formally (and advanced; productive?), using Shannon’s information theory and 
formalizations derived from it.

 

I admit my own communication-theoretical a priori. I am interested in the 
communication of knowledge as different from the communication of information. 
Discursive knowledge specifies and codifies meaning. The communication/sharing 
of meaning provides an in-between layer, which has also to be distinguished 
from the communication of information. Meaning is not relational but 
positional; it cannot be communicated, but it can be shared. I am currently 
working (with coauthors) on a full paper on the subject. The following is the 
provisional abstract: 

As against a monadic reduction of knowledge and meaning to signal processing 
among neurons, we distinguish among information and meaning processing, and the 
possible codification of specific meanings as discursive knowledge. Whereas the 
Shannon-type information is coupled to the second law of thermodynamics, 
redundancy—that is, the complement of information to the maximum entropy—can be 
extended by further distinctions and the specification of expectations when new 
options are made feasible. With the opposite sign, the dynamics of knowledge 
production thus infuses the historical (e.g., institutional) dynamics with a 
cultural evolution. Meaning is provided from the perspective of hindsight as 
feedback on the entropy flow. The circling among dynamics in feedback and 
feedforward loops can be evaluated by the sign of mutual information. When 
mutual redundancy prevails, the resulting sign is negative indicating that more 
options are made available and innovation can be expected to flourish. The 
relation of this cultural evolution with the computation of anticipatory 
systems can be specified; but the resulting puzzles are a subject for future 
research.
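
As a minimal sketch of the measurement invoked at the end of this abstract, 
the following computes the elementary two-dimensional quantity (mutual 
information from a joint distribution); the mutual-redundancy extension and 
its sign are developed in the paper itself, and the numbers here are 
illustrative only:

import math

def mutual_information(joint):
    # T(x, y) = sum over x, y of p(x, y) * log2(p(x, y) / (p(x) * p(y))), in bits
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    t = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                t += pxy * math.log2(pxy / (px[i] * py[j]))
    return t

# Illustrative joint distribution over two binary variables:
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))   # ~0.28 bits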

Best,

Loet

 


 

 




Re: [Fis] What is information? and What is life?

2016-12-23 Thread Loet Leydesdorff
Dear Terrence and colleagues, 

 

I agree that we should not be fundamentalistic about “information”. For 
example, one can also use “uncertainty” as an alternative word to Shannon-type 
“information”. One can also make distinctions other than 
semantic/syntactic/pragmatic, such as biological information, etc.

 

Nevertheless, what makes this list a common platform, in my opinion, is our 
interest in the differences and similarities in the background of these 
different notions of information. In my opinion, the status of Shannon’s 
mathematical theory of information is different from special theories of 
information (e.g., biological ones), since the formal theory enables us to 
translate between these latter theories. The translations are heuristically 
important: they enable us to import metaphors from other backgrounds (e.g., 
auto-catalysis).

 

For example, one of us communicated with me why I was completely wrong, and 
made the argument with reference to the Kullback-Leibler divergence between two 
probability distributions. Since we probably will not have “a general theory” 
of information, the apparatus in which information is formally and 
operationally defined—Bar-Hillel once called it “information calculus”—can 
carry this interdisciplinary function with precision and rigor. Otherwise, we 
can only be respectful of each other’s research traditions. :-)
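
For readers unfamiliar with the measure invoked here, a minimal sketch of the 
Kullback-Leibler divergence between two probability distributions 
(illustrative numbers; note that the measure is not symmetric):

import math

def kl_divergence(p, q):
    # D(p || q) = sum(p_i * log2(p_i / q_i)), in bits; requires q_i > 0 wherever p_i > 0
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.6, 0.3, 0.1]
q = [0.2, 0.5, 0.3]

print(kl_divergence(p, q))   # ~0.57 bits
print(kl_divergence(q, p))   # ~0.53 bits; D(p||q) != D(q||p) in general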

 

I wish you all a splendid 2017,

Loet   

 


 

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Terrence W. DEACON
Sent: Thursday, December 22, 2016 5:33 AM
To: fis
Subject: Re: [Fis] What is information? and What is life?

 

Against information fundamentalism

 

Rather than fighting over THE definition of information, I suggest that we 
stand back from the polemics for a moment and recognize that the term is being 
used in often quite incompatible ways in different domains, and that there may 
be value in paying attention to the advantages and costs of each. If we ignore 
these differences, fail to explore the links and dependencies between them, 
and remain indifferent to the different use values gained or sacrificed by 
each, I believe that we end up undermining the very enterprise we claim to be 
promoting.

 

We currently lack broadly accepted terms to unambiguously distinguish these 
divergent uses and, even worse, we lack a theoretical framework for 
understanding their relationships to one another.

So provisionally I would argue that we at least need to distinguish three 
hierarchically related uses of the concept:

 

1. Physical information: Information as intrinsically measurable medium 
properties with respect to their capacity to support 2 or 3 irrespective of any 
specific instantiation of 2 or 3.

 

2. Referential information: Information as a non-intrinsic relation to 
something other than medium properties (1) that a given medium can provide 
(i.e. reference or content) irrespective of any specific instantiation of 3.

 

3. Normative information: Information as the use value provided by a given 
referential relation (2) with respect to an end-directed dynamic that is 
susceptible to contextual factors that are not directly accessible (i.e. 
functional value or significance).

 

Unfortunately, because of the history of using the same term in an unmodified 
way in each relevant domain, irrespective of the others, there are often 
pointless arguments of a purely definitional nature.

 

In linguistic theory an analogous three-part hierarchic partitioning of theory 
IS widely accepted. 

 

1. syntax

2. semantics

3. pragmatics

 

Thus by analogy some have proposed the distinction between

 

1. syntactic information (aka Shannon)

2. semantic information (aka meaning)

3. pragmatic information (aka useful information)

 

This has also often been applied to the philosophy of information (e.g. see the 
Stanford Encyclopedia of Philosophy entry for ‘information’). Unfortunately, the 
language-centric framing of this distinction can be somewhat misleading. The 
metaphoric extension of the terms ‘syntax’ and ‘semantics’ to apply to iconic 
(e.g. pictorial) or indexical (e.g. correlational) forms of communication 
exerts a subtle procrustean influence that obscures their naturalistic and 
nondigital features.

Re: [Fis] What is information? and What is life?

2016-12-22 Thread Dai Griffiths
>  Information is not “something out there” which “exists” otherwise 
than as our construct.


I agree with this. And I wonder to what extent our problems in 
discussing information come from our desire to shoe-horn many different 
phenomena into the same construct. It would be possible to disaggregate 
the construct. It would be possible to discuss the topics which we address on 
this list without using the word 'information'. We could discuss 
redundancy, variety, constraint, meaning, structural coupling, 
coordination, expectation, language, etc.


In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our 
discussions might become (even more) remote from everyday human 
experience. But many scientific discussions are remote from everyday 
human experience.


Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:


Dear colleagues,

A distribution contains uncertainty that can be measured in terms of 
bits of information.


Alternatively: the expected information content H of a probability 
distribution is H = -Σ_i p_i log2(p_i).


H is further defined as probabilistic entropy using Gibbs’ 
formulation of the entropy, S = -k_B Σ_i p_i ln(p_i).
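

A minimal sketch of this definition in code (the probabilities are 
illustrative, and the function name is mine):

    import math

    def entropy_bits(p):
        """Expected information content H = -sum_i p_i * log2(p_i) of a
        probability distribution, in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    print(entropy_bits([0.5, 0.5]))   # 1.0 bit: a fair coin
    print(entropy_bits([0.9, 0.1]))   # ~0.469 bits: less uncertainty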


This definition of information is an operational definition. In my 
opinion, we do not need an essentialistic definition by answering the 
question of “what is information?” As the discussion on this list 
demonstrates, one does not easily agree on an essential answer; one 
can answer the question “how is information defined?” Information is 
not “something out there” which “exists” otherwise than as our construct.


Using essentialistic definitions, the discussion tends not to move 
forward. Consider, for example, Stuart Kauffman’s and Bob Logan’s (2007) 
definition of information “as natural selection assembling the very 
constraints on the release of energy that then constitutes work and 
the propagation of organization.” I asked several times what this 
means and how one can measure this information. Hitherto, I have only 
obtained the answer that colleagues who disagree with me will be 
cited. :-) Another answer was that “counting” may lead to populism. :-)


Best,

Loet



Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex;
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou;
Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London;
http://scholar.google.com/citations?user=ych9gNYJ=en

From: Dick Stoute [mailto:dick.sto...@gmail.com]
Sent: Monday, December 19, 2016 12:48 PM
To: l...@leydesdorff.net
Cc: James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
Subject: Re: [Fis] What is information? and What is life?

List,

Please allow me to respond to Loet about the definition of information 
stated below.


1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)


I agree.  I struggled with this definition for a long time before 
realising that Shannon was really discussing the "amount of information", 
or the number of bits needed to convey a message.  He was looking for 
a formula that would provide an accurate estimate of the number of 
bits needed to convey a message, and he realised that the amount of 
information (number of bits) needed to convey a message depended 
on the "amount" of uncertainty that had to be eliminated, and so he 
equated these.


It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure an amount of 
water in liters, but this does not tell us what water is, and likewise 
the measure we use for "amount of information" does not tell us what 
information is. We can, for example, equate the amount of water needed 
to fill a container with the volume of the container, but we should 
not think that water is therefore identical to an empty volume.  
Similarly, we should not think that information is identical to 
uncertainty.


By equating the number of bits needed to convey a message with the 
"amount of uncertainty" that has to be eliminated, Shannon, in effect, 
equated opposites so that he could get an estimate of the number of 
bits needed to eliminate the uncertainty.  We should not therefore 
conclude that this equation establishes what information is.
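
To make the "number of bits" reading concrete, here is a minimal sketch (the 
probabilities are made up and the helper name is mine): the entropy of the 
message distribution is the least average number of bits per message that any 
code can achieve.

    import math

    def bits_needed(p):
        """Entropy in bits: the minimum average code length per message."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    # four equally likely messages: no code can beat 2 bits per message
    print(bits_needed([0.25] * 4))                    # 2.0
    # same four messages, skewed probabilities: less uncertainty to
    # eliminate, so fewer bits on average (the code 0, 10, 110, 111
    # achieves exactly this bound)
    print(bits_needed([0.5, 0.25, 0.125, 0.125]))     # 1.75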


Dick

On 18 December 2016 at 15:05, Loet Leydesdorff <l...@leydesdorff.net> wrote:


Dear James and colleagues,

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:

Re: [Fis] What is information? and What is life?

2016-12-19 Thread Bob Logan
Dear Dick - I loved your analysis. You are right on the money. It also explains 
why Shannon dominated the field of information. He had a mathematical formula, 
and there is nothing more appealing to a scientist than a mathematical formula. 
But you are right: his formula only tells us how many bits are needed to 
represent some information, but it tells us nothing about its meaning or its 
significance. As Marshall McLuhan said about Shannon information, it is figure 
without ground. A figure only acquires meaning when one understands the ground 
in which it operates. So Shannon’s contribution to engineering is excellent, but 
it tells us nothing about the nature or the impact of information, as you wisely 
pointed out. Thanks for your insight.

I would like to refer to your insight the next time I write about info and want 
to attribute you correctly. Can you tell me a bit about yourself, like where you 
do your research? Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications


On Dec 19, 2016, at 6:48 AM, Dick Stoute  wrote:

List,

Please allow me to respond to Loet about the definition of information stated 
below.  

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)
 
I agree.  I struggled with this definition for a long time before realising 
that Shannon was really discussing the "amount of information", or the number 
of bits needed to convey a message.  He was looking for a formula that would 
provide an accurate estimate of the number of bits needed to convey a message, 
and he realised that the amount of information (number of bits) needed to 
convey a message depended on the "amount" of uncertainty that had to be 
eliminated, and so he equated these.

It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure an amount of water 
in liters, but this does not tell us what water is, and likewise the measure we 
use for "amount of information" does not tell us what information is. We can, 
for example, equate the amount of water needed to fill a container with the 
volume of the container, but we should not think that water is therefore 
identical to an empty volume.  Similarly, we should not think that information 
is identical to uncertainty.

By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated, Shannon, in effect, equated opposites so 
that he could get an estimate of the number of bits needed to eliminate the 
uncertainty.  We should not therefore conclude that this equation establishes 
what information is. 

Dick


On 18 December 2016 at 15:05, Loet Leydesdorff wrote:
Dear James and colleagues,

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8)

 

The definition of information as relevant for a system of reference confuses 
information with "meaningful information" and thus sacrifices the surplus value 
of Shannon's counter-intuitive definition.

 

information observer

 

that integrates interactive processes such as

 

physical interactions such as photons stimulating the retina of the eye, 
human-machine interactions (this is the level that Shannon lives on), 
biological interactions such as body temperature relative to touching ice or a 
heat source, social interactions such as this forum started by Pedro, economic 
interactions such as the stock market, ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure meaningful 
information. In a previous series of communications we discussed redundancy 
from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between 
a priori entropy and a posteriori entropy), which is distinguished from the 
notion of relative information Iap (Lerner, page 7).

 

The quantity Σ_i q_i log2(q_i/p_i) expresses in bits of information the 
information generated when the a priori distribution (p) is turned into the a 
posteriori one (q). This follows within the Shannon framework without needing 
an observer. I use this equation, for example, in my 1995 book The Challenge of 
Scientometrics (Chapters 8 and 9), with a reference to Theil (1972). The 
relative information is defined as H/H(max).
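
As a minimal sketch (the distribution is illustrative, and the function names 
are mine, not Lerner's or Theil's notation), the relative information and its 
complement, the redundancy, can be computed directly:

    import math

    def entropy_bits(p):
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def relative_information(p):
        """H/H(max), where H(max) = log2(n) for n possible states."""
        return entropy_bits(p) / math.log2(len(p))

    p = [0.7, 0.1, 0.1, 0.1]
    print(round(relative_information(p), 3))       # 0.678
    print(round(1 - relative_information(p), 3))   # 0.322, the redundancy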

 

I agree that the intuitive notion of information is derived from the Latin 
“in-formare” (Varela, 1979). But most of us no longer use “force” and “mass” 
in the intuitive (Aristotelian) sense. :-) The proliferation of the meanings of 
information, if confused with “meaningful information”, is indicative of an 
“index sui et falsi”, in my opinion.

Re: [Fis] What is information? and What is life?

2016-12-19 Thread Dick Stoute
List,

Please allow me to respond to Loet about the definition of information
stated below.

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)



I agree.  I struggled with this definition for a long time before realising
that Shannon was really discussing the "amount of information", or the number
of bits needed to convey a message.  He was looking for a formula that would
provide an accurate estimate of the number of bits needed to convey a
message, and he realised that the amount of information (number of bits)
needed to convey a message depended on the "amount" of uncertainty that had
to be eliminated, and so he equated these.


It makes sense to do this, but we must distinguish between "amount of
information" and "information".  For example, we can measure an amount of
water in liters, but this does not tell us what water is, and likewise the
measure we use for "amount of information" does not tell us what
information is. We can, for example, equate the amount of water needed to
fill a container with the volume of the container, but we should not think
that water is therefore identical to an empty volume.  Similarly, we should
not think that information is identical to uncertainty.


By equating the number of bits needed to convey a message with the "amount
of uncertainty" that has to be eliminated, Shannon, in effect, equated
opposites so that he could get an estimate of the number of bits needed to
eliminate the uncertainty.  We should not therefore conclude that this
equation establishes what information is.


Dick


On 18 December 2016 at 15:05, Loet Leydesdorff  wrote:

> Dear James and colleagues,
>
>
>
> Weaver (1949) made two major remarks about his coauthor (Shannon)'s
> contribution:
>
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
> 2. "In particular, information must not be confused with meaning." (p. 8)
>
>
>
> The definition of information as relevant for a system of reference
> confuses information with "meaningful information" and thus sacrifices the
> surplus value of Shannon's counter-intuitive definition.
>
>
>
> information observer
>
>
>
> that integrates interactive processes such as
>
>
>
> physical interactions such as photons stimulating the retina of the eye,
> human-machine interactions (this is the level that Shannon lives on),
> biological interactions such as body temperature relative to touching ice or
> a heat source, social interactions such as this forum started by Pedro,
> economic interactions such as the stock market, ... [Lerner, page 1].
>
>
>
> We are in need of a theory of meaning. Otherwise, one cannot measure
> meaningful information. In a previous series of communications we discussed
> redundancy from this perspective.
>
>
>
> Lerner introduces the mathematical expectation E[Sap] (the difference between
> a priori entropy and a posteriori entropy), which is distinguished from
> the notion of relative information Iap (Lerner, page 7).
>
>
>
> The quantity Σ_i q_i log2(q_i/p_i) expresses in bits of information the
> information generated when the a priori distribution (p) is turned into the
> a posteriori one (q). This follows within the Shannon framework without
> needing an observer. I use this equation, for example, in my 1995 book *The
> Challenge of Scientometrics* (Chapters 8 and 9), with a reference to Theil
> (1972). The relative information is defined as *H*/*H*(max).
>
>
>
> I agree that the intuitive notion of information is derived from the Latin
> “in-formare” (Varela, 1979). But most of us no longer use “force” and
> “mass” in the intuitive (Aristotelian) sense. :-) The proliferation of the
> meanings of information, if confused with “meaningful information”, is
> indicative of an “index sui et falsi”, in my opinion. The repetitive
> discussion slows the progress of this list. It is “like asking whether a
> glass is half empty or half full” (Hayles, 1990, p. 59).
>
>
>
> This act of forming an information process results in the
> construction of an observer that is the owner [holder] of information.
>
>
>
> The system of reference is then no longer the message, but the observer
> who provides meaning to the information (uncertainty). I agree that this is
> a selection process, but the variation first has to be specified
> independently (before it can be selected).
>
>
>
> And Lerner introduces the threshold between objective and subjective
> observers (page 27). This leads to a consideration of selection and
> cooperation that includes entanglement.
>
>
>
> I don’t see a direct relation between information and entanglement. An
> observer can be entangled.
>
>
>
> Best,
>
> Loet
>
>
>
> PS. Pedro: Let me assume that this is my second posting in the week which
> ends tonight. :-(
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>


-- 

4 Austin Dr. Prior Park St. James, Barbados 

Re: [Fis] What is information? and What is life?

2016-12-18 Thread Loet Leydesdorff
Dear James and colleagues, 

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s
contribution:

 

1. the definition of information as uncertainty is counter-intuitive
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8) 

 

The definition of information as relevant for a system of reference confuses
information with "meaningful information" and thus sacrifices the surplus
value of Shannon's counter-intuitive definition.

 

information observer

 

that integrates interactive processes such as 

 

physical interactions such as photons stimulating the retina of the eye,
human-machine interactions (this is the level that Shannon lives on),
biological interactions such as body temperature relative to touching ice or a
heat source, social interactions such as this forum started by Pedro, economic
interactions such as the stock market, ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure
meaningful information. In a previous series of communications we discussed
redundancy from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between
a priori entropy and a posteriori entropy), which is distinguished from
the notion of relative information Iap (Lerner, page 7).

 

The quantity Σ_i q_i log2(q_i/p_i) expresses in bits of information the
information generated when the a priori distribution (p) is turned into the a
posteriori one (q). This follows within the Shannon framework without needing
an observer. I use this equation, for example, in my 1995 book The Challenge
of Scientometrics (Chapters 8 and 9), with a reference to Theil (1972). The
relative information is defined as H/H(max).

 

I agree that the intuitive notion of information is derived from the Latin
"in-formare" (Varela, 1979). But most of us no longer use "force" and
"mass" in the intuitive (Aristotelian) sense. :-) The proliferation of the
meanings of information, if confused with "meaningful information", is
indicative of an "index sui et falsi", in my opinion. The repetitive
discussion slows the progress of this list. It is "like asking whether a
glass is half empty or half full" (Hayles, 1990, p. 59). 

 

This act of forming an information process results in the
construction of an observer that is the owner [holder] of information.

 

The system of reference is then no longer the message, but the observer who
provides meaning to the information (uncertainty). I agree that this is a
selection process, but the variation first has to be specified independently
(before it can be selected).

 

And Lerner introduces the threshold between objective and subjective
observers (page 27). This leads to a consideration of selection and
cooperation that includes entanglement.

 

I don't see a direct relation between information and entanglement. An
observer can be entangled.

 

Best, 

Loet

 

PS. Pedro: Let me assume that this is my second posting in the week which
ends tonight. :-(

 

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What is Information? - Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere and the Econosphere

2016-06-17 Thread Moisés André Nisenbaum
Hi, Bob.
It is an awesome book! I read the Portuguese version.
I am happy to know that it is available online for everyone :-)

Kind regards.

Moisés


2016-06-15 18:21 GMT-03:00 Bob Logan :

> Dear FIS colleagues - I received three complimentary emails re my paper,
> Propagating Organization: An Enquiry, the paper I wrote with Stuart
> Kauffman and others and which I shared with the list. As a result  I
> thought some of you might be interested in the book I wrote based on that
> paper entitled *What is Information? - Propagating Organization in the
> Biosphere, the Symbolosphere, the Technosphere and the Econosphere *with
> a foreword written by Terry Deacon.
>
> The ebook version of the book is available for free at demopublishing.com.
> Please feel free to have a look at it and grab a copy of it - Thanks - Bob
> Logan
>
>
> __
>
> Robert K. Logan
> Prof. Emeritus - Physics - U. of Toronto
> Fellow University of St. Michael's College
> Chief Scientist - sLab at OCAD
> http://utoronto.academia.edu/RobertKLogan
> www.physics.utoronto.ca/Members/logan
> www.researchgate.net/profile/Robert_Logan5/publications
>
>
>
>
>
>
>
>
>
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>


-- 
Moisés André Nisenbaum
Doutorando IBICT/UFRJ. Professor. Msc.
Instituto Federal do Rio de Janeiro - IFRJ
Campus Rio de Janeiro
moises.nisenb...@ifrj.edu.br
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] What is Information? - Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere and the Econosphere

2016-06-15 Thread Bob Logan
Dear FIS colleagues - I received three complimentary emails re my paper, 
Propagating Organization: An Enquiry, the paper I wrote with Stuart Kauffman 
and others and which I shared with the list. As a result  I thought some of you 
might be interested in the book I wrote based on that paper entitled What is 
Information? - Propagating Organization in the Biosphere, the Symbolosphere, 
the Technosphere and the Econosphere with a foreword written by Terry Deacon.

The ebook version of the book is available for free at demopublishing.com. 
Please feel free to have a look at it and grab a 
copy of it - Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] what is information

2015-10-03 Thread Emanuel Diamant
Dear all,

 

I apologize for the delayed response and fragmented personal replies. 

 

I apologize that not all of your responses were selected for further
comment: only those that were the first to arrive and those that look like
the most relevant ones.

 

I apologize that not all the topics of these responses are addressed in my
answers, only those that seem to me the most relevant ones.

  

I apologize that the style of my answers is not always elegant and polite
(that is one of my problems), but the reason is not my bad character or bad
temper; the reason is my poor knowledge of English. Sorry.

 

And here are the replies:  

 

=  

To Krassimir Markov, (September 24, 2015).

 

Thank you for your response. Yet I don’t like it – you force me to repeat
commonplace banalities again and again. I have no stomach for this sort of
thing, but you insist – and so I have to repeat:

 

Every scientific discourse or dialogue begins with establishing some initial
basic assumptions, which do not need to be solid or substantial (therefore,
I dare to call them axioms). In the course of a subsequent reasoning, the
initial assumptions are being transformed into a set of hypotheses, which
are being applied to explain the existence or to predict results of
observations of some natural phenomenon. If the hypotheses successfully pass
this trial, the initial assumptions become regarded as true and sustainable.
Then the next round of hypotheses ramification and complication comes into
being until a full-blown new theory has become available.

 

This is the way of thinking and reasoning with which I am familiar. Your
question “What is data?” does not meet the conventions of such a discourse.
Reading some of your previously published papers, I can guess which basic
assumptions you adhere to. But you do not declare them
yourself. Why should I do that instead of you? Do that, defend their
validity by making prediction tests (as described above), and do not
ask “smart” questions.

 

Meanwhile I will ignore and discard your response. 

 

  

To Günther Witzany, (September 24, 2015).

 

Thank you very much for your response. I am a great admirer of your
publications. Long before they become available on the Research Gate, I was
really a stubborn hunter for them, happy with any piece of text (mainly
abstracts) that I was lucky to catch (I am an engineer and thus biological
publications usually are not accessible to me). 

 

As I just stated above (in my answer to Krassimir), the only way to
develop a new theory is to validate and justify its initial assumptions by
applying them to an explanatory description of an observable natural
phenomenon. In this regard, your publications were exactly what I
needed. (Although long before your papers became available to me, I gained
my inspiration from the papers of Eshel Ben-Jacob, published a couple of
years earlier.) Never mind: his and your explorations were
thought-provoking, inspiring and really helpful to me. However, some
reservations regarding their subject are worth mentioning here.

 

In Eshel’s and your early publications, the term “model”/“modeling” is often
encountered on various occasions. In contemporary science, the term has a
purely mathematical connotation and usually implies data modeling. What
follows from this is that “modeling” (and your engagement with it)
explicitly presumes mandatory data processing. In the context of my
assumptions-speculations, data processing is tightly bound to physical
information processing. And that is the point – you are all, all the time,
busy with physical information processing, even though you do not recognize
the notion of it. (Semantic information processing also remains a terra
incognita for you.)

 

In this regard, your description of the marvels and the mysteries of the
“chemical Auxin” loses its argumentative strength. First, Auxin is not a
chemical (as you call it); it is a plant hormone (Wikipedia), a hormonal
messenger. The message that a messenger is carrying is actually a piece of
text written in a language (which is still unknown to us) with a chemical
alphabet (that is also unknown to us). The message as a rule contains
physical and semantic information subparts (text pieces).

 

I apologize for drawing you into this mishmash, but the time is ripe to
clarify some of the matters.

For a long time, you, Günther, have on various occasions asserted the
following statement: 

 

“The Modern Synthesis regards the genetic code as a lineup of molecules that
can be investigated through physics and chemistry and mathematics (for
sure). And this is it. However, we know the genetic content of organisms is
about communication, which cannot be reduced simply to physics and
chemistry. Since the 1970s, Manfred Eigen has insisted that the genetic code
is really a language, not just metaphorically, but a real language”. (In an

[Fis] What are information and science?

2015-05-23 Thread Marcus Abundis
Dear Colleagues,

Re Pedro's point and other related postings . . .

 I would never bet for a new info-reductionism, or explanatory monism,
science is an elegant Babel construction always condemned --or enjoying--
the plurality of disciplinary languages and views.

I echo the questions around communication and information at different
levels, and BETWEEN different systems/levels – this further takes me back
to a point I raised at the end of Deacon's last (second) session, but that
was not really addressed. This has to do with the nature of emergent things
BETWEEN systems (or levels of analysis). This proved (in his sessions) to
be a rather chronic issue in trying to grasp/convey Deacon's modeling – or
now, even in modeling an effective FIS(?). The Deacon session didn't really
seem to land anywhere (re Pedro's condemned/enjoyed Babel) . . . and
here we are again, no?

So I am now left to wonder if we are to accept the futility of
modeling emergent things (which seems to be a critical deeper issue), that
might otherwise offer a bridge between systems/levels, or if that
imagined/impossible(?) new info-reductionism, or explanatory monism is to
be actively attempted and explored here? As a new member, I simply wish to
know what might be reasonably tolerated.

Thanks to all for your earlier thoughts!

Marcus Abundis
about.me/marcus.abundis
http://about.me/marcus.abundis?promo=email_sig
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What are information and science?

2015-05-20 Thread Loet Leydesdorff
Dear colleagues, 

 

 

I see informational processes as essentially being proto-scientific – how is 
any science not an informational process? 

 

The sciences, in my opinion, are different in terms of what is communicated. As 
Maturana noted, the communication of molecules generates a biology. Similarly, 
the communication of atoms generates a chemistry, etc. The communication of 
words and sentences generates the interhuman domain of communication. One can 
also communicate in terms of symbolic media such as money. This can be 
reflected by economics.

 

Thus, the sciences are different. The formal perspective (of the mathematical 
theory of communication) provides us with tools to move metaphors heuristically 
from one domain to another. The assumption that the mathematics is general is 
over-stated, in my opinion. One has to carefully check and elaborate after each 
translation from one domain to another. In this sense, I agree with 
“proto-scientific”.

 

Best,

Loet

 

 

First, I think this places me in the camp of Peirce's view. Second, I am unsure 
of how to regard the focus on higher-order interdisciplinary discussions when 
a much more essential view of lower-order roles (i.e., What are science and 
information?) has not been first established.

 

From my naive view I find myself wondering how informational process is 
not the ONE overarching discipline from which all other disciplines are born 
(is this too psychological of a framework?). As such, I argue for one great 
discipline . . . and thus wouldn't try to frame my view in terms of science, 
mostly because I am unclear on how the term science is being formally used 
here. Thoughts?



 



Marcus Abundis

about.me/marcus.abundis



  



 

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What are information and science?

2015-05-20 Thread Dai Griffiths
Thanks Loet, that is helpful, and makes intuitively good sense. But I 
remain puzzled. I see two distinct cases:


Case 1: For molecules 'communication' consists of interaction between 
the molecules themselves, resulting in biology.
Similarly, for atoms 'communication' consists of interaction between the 
atoms themselves. They bang into each other and exchange their components.


Case 2: For words and sentences (in my view of the world) it is human 
beings who communicate, not words and sentences. From a Maturana 
perspective, language is a recursive coordination between autopoietic 
entities, not interaction between linguistic items.


In case 1, there is no mediating domain. Molecules and atoms interact 
directly.


But in case 2, there is a hierarchy. Communication is between human 
beings, but interaction is through words and sentences in a linguistic 
domain. When I respond to your email, I do not have an effect on that 
email. Rather, I hope to have an effect on your thought processes.


Of course there are other interactions between people which correspond 
to my case 1, for example when someone barges another person out of the 
way, or when they dance together. But I think Maturana would distinguish 
these examples by describing them in terms of structural coupling rather 
than languaging.


By calling both of these cases 'communication' we gain some valuable 
traction on patterns of interaction in different domains. But I am 
concerned that we also make it more difficult to disentangle our idea of 
what information is, by equating it with a catch-all notion of 
'communication'.


Dai


On 20/05/15 11:12, Loet Leydesdorff wrote:


Dear colleagues,

I see informational processes as essentially being proto-scientific 
– how is any science not an informational process?


The sciences, in my opinion, are different in terms of what is 
communicated. As Maturana noted, the communication of molecules 
generates a biology. Similarly, the communication of atoms generates a 
chemistry, etc. The communication of words and sentences generates the 
interhuman domain of communication. One can also communicate in terms 
of symbolic media such as money. This can be reflected by economics.


Thus, the sciences are different. The formal perspective (of the 
mathematical theory of communication) provides us with tools to move 
metaphors heuristically from one domain to another. The assumption 
that the mathematics is general is over-stated, in my opinion. One has 
to carefully check and elaborate after each translation from one 
domain to another. In this sense, I agree with “proto-scientific”.


Best,

Loet

First, I think this places me in the camp of Peirce's view. Second, I 
am unsure of how to regard the focus on higher-order 
interdisciplinary discussions when a much more essential view of 
lower-order roles (i.e., What are science and information?) has not 
been first established.


From my naive view I find myself wondering how informational 
process is not the ONE overarching discipline from which all other 
disciplines are born (is this too psychological of a framework?). As 
such, I argue for one great discipline . . . and thus wouldn't try to 
frame my view in terms of science, mostly because I am unclear on 
how the term science is being formally used here. Thoughts?


*Marcus Abundis*

about.me/marcus.abundis




___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


--
-

Professor David (Dai) Griffiths

Professor of Educational Cybernetics
Institute for Educational Cybernetics (IEC)
The University of Bolton
http://www.bolton.ac.uk/IEC

SKYPE: daigriffiths
UK Mobile: + 44 (0)7826917705
Spanish Mobile: + 34 687955912
email: dai.griffith...@gmail.com

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] What are information and science?

2015-05-20 Thread Marcus Abundis
Greetings to all,

As I read these comments I have a hard time finding an effective
anchor upon which to add notes. I see informational processes as
essentially being proto-scientific – how is any science not an
informational process? First, I think this places me in the camp of
Peirce's view. Second, I am unsure of how to regard the focus on
higher-order interdisciplinary discussions when a much more essential
view of lower-order roles (i.e., What are science and information?) has not
been first established.

From my naive view I find myself wondering how informational
process is not the ONE overarching discipline from which all other
disciplines are born (is this too psychological of a framework?). As
such, I argue for one great discipline . . . and thus wouldn't try to frame
my view in terms of science, mostly because I am unclear on how the term
science is being formally used here. Thoughts?

Marcus Abundis
about.me/marcus.abundis
http://about.me/marcus.abundis?promo=email_sig
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What are information and science?

2015-05-20 Thread Dai Griffiths

Thanks Robert,

I agree with what you say about DNA, so I may be on the same slippery 
path to catastrophic heterodoxy!


In responding to the question what is information, started by Marcus, 
I was pointing out what seemed to me to be a shifting definition of 
'communication', and wondering if this corresponded to a shifting 
definition of 'information'.


Loet stated that the communication of words and sentences generates the 
interhuman domain of communication. I am not taking issue with this. My 
question is whether we are using the words 'communication' and 'generate' 
in the same sense when we also say that the communication of molecules 
generates a biology.


Your comments raise a related question. Perhaps it is not that molecules 
generate biology, but rather it is that biology (in the shape of the 
network of proteomic and enzymatic reactions) generates the 
communication of molecules?


Perhaps the problem is one of keeping track of the system in focus, and 
demarcating it clearly (as Stafford Beer might have argued at this 
juncture).


Dai

On 20/05/15 16:05, Robert E. Ulanowicz wrote:

Dear Dai:

To say that molecules only interact directly is to ignore the metabolic
matrix that constitutes the actual agency in living systems. For example,
we read everywhere how DNA/RNA directs development, when the molecule
itself is a passive material cause. It is the network of proteomic and
enzymatic reactions that actually reads, interprets and edits the
primitive genome. Furthermore, the structure of that reaction complex
possesses measurable information (and complementary flexibility).

Life is not just molecules banging into one another. That's a physicist's
(Lucretian) view of the world born of models that are rarefied,
homogeneous and (at most) weakly interacting. (Popper calls them vacuum
systems.) The irony is that that's not how the cosmos came at us! Vacuum
systems never appeared until way late in the evolution of the cosmos. So
the Lucretian perspective is one of the worst ways to try to make sense
of life. We need to develop a perspective that parallels cosmic evolution,
not points in the opposite direction. To do so requires that we shift from
objects moving according to universal laws to processes giving rise to
other processes (and structures along the way).

The contrast is most vividly illustrated in reference to the origin of
life. Conventional metaphysics requires us to focus on molecules, whereby
the *belief* is that at some point the molecules will miraculously jump up
and start to live (like the vision of the Hebrew prophet Ezekiel). A
process-oriented scenario would consist of a spatially large cycle of
complementary processes (e.g., oxidation and reduction) that constitutes a
thermodynamic work cycle. Those processes then can give rise to and
support smaller cycles, which eventually develop into something resembling
metabolic systems. A far more consistent progression!

Of course, this view is considered catastrophically heterodox, so please
don't repeat it if you don't already have tenure. ;-)

Peace,
Bob U.


  I see two distinct cases:

Case 1: For molecules 'communication' consists of interaction between
the molecules themselves, resulting in biology.
Similarly, for atoms 'communication' consists of interaction between the
atoms themselves. They bang into each other and exchange their components.

Case 2: For words and sentences (in my view of the world) it is human
beings who communicate, not words and sentences. From a Maturana
perspective, language is a recursive coordination between autopoietic
entities, not interaction between linguistic items.

In case 1, there is no mediating domain. Molecules and atoms interact
directly.

But in case 2, there is a hierarchy. Communication is between human
beings, but interaction is through words and sentences in a linguistic
domain. When I respond to your email, I do not have an effect on that
email. Rather, I hope to have an effect on your thought processes.





--
-

Professor David (Dai) Griffiths

Professor of Educational Cybernetics
Institute for Educational Cybernetics (IEC)
The University of Bolton
http://www.bolton.ac.uk/IEC

SKYPE: daigriffiths
UK Mobile: + 44 (0)7826917705
Spanish Mobile: + 34 687955912
email: dai.griffith...@gmail.com

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What are information and science?

2015-05-20 Thread Robert E. Ulanowicz
Dear Dai:

To say that molecules only interact directly is to ignore the metabolic
matrix that constitutes the actual agency in living systems. For example,
we read everywhere how DNA/RNA directs development, when the molecule
itself is a passive material cause. It is the network of proteomic and
enzymatic reactions that actually reads, interprets and edits the
primitive genome. Furthermore, the structure of that reaction complex
possesses measurable information (and complementary flexibility).

Life is not just molecules banging into one another. That's a physicist's
(Lucretian) view of the world born of models that are rarefied,
homogeneous and (at most) weakly interacting. (Popper calls them vacuum
systems.) The irony is that that's not how the cosmos came at us! Vacuum
systems never appeared until way late in the evolution of the cosmos. So
the Lucretian perspective is one of the worst ways to try to make sense
of life. We need to develop a perspective that parallels cosmic evolution,
not points in the opposite direction. To do so requires that we shift from
objects moving according to universal laws to processes giving rise to
other processes (and structures along the way).

The contrast is most vividly illustrated in reference to the origin of
life. Conventional metaphysics requires us to focus on molecules, whereby
the *belief* is that at some point the molecules will miraculously jump up
and start to live (like the vision of the Hebrew prophet Ezekiel). A
process-oriented scenario would consist of a spatially large cycle of
complementary processes (e.g., oxidation and reduction) that constitutes a
thermodynamic work cycle. Those processes then can give rise to and
support smaller cycles, which eventually develop into something resembling
metabolic systems. A far more consistent progression!

Of course, this view is considered catastrophically heterodox, so please
don't repeat it if you don't already have tenure. ;-)

Peace,
Bob U.

  I see two distinct cases:

 Case 1: For molecules 'communication' consists of interaction between
 the molecules themselves, resulting in biology.
 Similarly, for atoms 'communication' consists of interaction between the
 atoms themselves. They bang into each other and exchange their components.

 Case 2: For words and sentences (in my view of the world) it is human
 beings who communicate, not words and sentences. From a Maturana
 perspective, language is a recursive coordination between autopoietic
 entities, not interaction between linguistic items.

 In case 1, there is no mediating domain. Molecules and atoms interact
 directly.

 But in case 2, there is a hierarchy. Communication is between human
 beings, but interaction is through words and sentences in a linguistic
 domain. When I respond to your email, I do not have an effect on that
 email. Rather, I hope to have an effect on your thought processes.


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis