Re: [Fis] info & meaning

2007-10-15 Thread John Collier

At 01:36 PM 14/10/2007, Michel Petitjean wrote:

To: fis@listas.unizar.es
Subject: [Fis] Re: info & meaning

bob logan <[EMAIL PROTECTED]> wrote:
> Loet et al - I guess I am not convinced that information and entropy
> are connected. Entropy in physics has the dimension of energy divided
> by temperature. Shannon entropy has no physical dimension - it is
> missing the Boltzman constant. Therefore how can entropy and shannon
> entropy be compared yet alone connected?

As indicated on http://en.wikipedia.org/wiki/Shannon_entropy
about the relationship of information entropy to thermodynamic entropy:
<< The inspiration for adopting the word entropy in information theory
came from the close resemblance between Shannon's formula and very similar
known formulae from thermodynamics. >>
In other words, the link between information entropy and thermodynamic entropy
is just a formal analogy between equations in two mathematical models:
one of these models is applied to physical science, the other is applied
to communication science.
I cannot see more connection between these entropies,
but maybe some other people do.


The connection is spelled out in some detail in Leon Brillouin,
Science and Information Theory, 2nd edition 1962, Academic
Press. I believe there is an edition still in print. Also see the
book by my student Scott Muller, that I mentioned in my second to
last post. Information, as Schroedinger made clear, is a measure
of negative entropy (complement of entropy, Hmax - Hact) if
you pay attention to physical embodiment. It is not entropy, but
closely related. Of course if you ignore physical embodiment, then
all bets are off, but in that case I, personally, would not know
what you are talking about.
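
To make the Hmax - Hact reading concrete, here is a minimal illustrative
sketch in Python (the state space and numbers are toy choices, not anything
from Brillouin or Muller):

import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Hmax - Hact: how far the distribution falls short of maximum entropy."""
    h_max = math.log2(len(probs))      # uniform distribution over the same states
    return h_max - shannon_entropy(probs)

# A strongly constrained (organized) state distribution carries more
# negentropy than a nearly uniform (disordered) one over the same state space.
organized = [0.85, 0.05, 0.05, 0.05]
disordered = [0.25, 0.25, 0.25, 0.25]
print(negentropy(organized))    # about 1.15 bits
print(negentropy(disordered))   # 0.0 bits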

Talking about information, or meaning for that matter, without
respect for its embodiment is basically bullshit (or it is mathematics,
which is OK, but not informative, as it is all tautological). So
let's dump the non-precise tautologies and start talking about
something! Shannon's work gave us a useful mathematical
tool for discussing the capacity of a channel. Barwise and
Seligman have more recently given a useful mathematical
tool for describing the properties of a channel. Kolmogorov
and Chaitin have given us useful mathematical tools for
connecting information to logic and probability. Ingarden
has given more abstract tools for combining the various
ideas and grounding probability in information and logic.
Shannon never even told us what probability is, though his
approach works on any reasonable account. None of these
mathematical tools in themselves can give us a scientific
theory of information, however, since they are all tautological.
At best they give more or less compatible formal definitions.
But to know what information is we need an interpretation
that fits all of these approaches (or at least gives a good
reason why it doesn't and shouldn't).

Formal theories tell us nothing. They are tautological,
as are all formal definitions. In particular, they cannot tell
us how their interpretations in the real world are connected
or if they are. There is no a priori reason why two models
of Shannon's theory have anything to do with each other
beyond formal analogy, let alone whether information
theory and statistical mechanics have anything to
do with each other because they share formal aspects.
Mathematics is incapable by its nature of telling us
if any two things are connected in any substantial
way, so independence of development, or common
development, can't tell us anything one way or the
other about the connections in the world.

In this line, for example, Barwise and Seligman
give us a formalism that describes an information
channel. However, since causal connection
cannot be formalised, they use regularity as a
substitute. But some regularities are coincidental,
and others are epiphenomenal -- neither of these
can bear information, despite satisfying the
B&S formalism. The same is true of Shannon's
formalism, incidentally. We must pay attention
to how things are embodied in order to get applications
of either formalism right. Fortunately this is not
so hard. But then we can't just ignore this aspect
of the applications of formalisms when we look at
how different formalisms are related to each other,
especially if we do anything other than claim
agnosticism.

It is possible to apply Shannon's communication
theory in ways that are not compatible with
statistical mechanics. It is also possible,
however, to apply Shannon's theory in ways
that are not compatible with each other (in
the same system, no less). This tells us
nothing about how we should apply these theories
or whether they are connected to each other.
They are tautologies. They don't come with
intended applications. These have to be discovered,
not predefined. It is a substantive, empirical issue.
It can't be decided on the basis of knowledge of
independent applications compiled in the Wikipedia
or anywhere else. We have to

[Fis] info & meaning

2007-10-15 Thread Rafael Capurro

Hi,

instead of discussing endlessly whether "information is this or that" 
wouldn't it be more productive to say: in physics (or chemistry or biology...) 
there is a phenomenon X that we (whoever this "we" is) call information?


I have the impression that otherwise we are hostages of a Platonic or 
essentialist discussion


kind regards

Rafael

--
Prof. Dr. Rafael Capurro
Hochschule der Medien (HdM) - Stuttgart Media University, Wolframstr. 32
70191 Stuttgart, Germany
Private: Redtenbacherstr. 9, 76133 Karlsruhe, Germany
E-Mail: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Voice Stuttgart: + 49 - 711 - 25706 - 182
Voice private: + 49 - 721 - 98 22 9 - 22 (Fax: -21)
Homepage: www.capurro.de
Homepage ICIE: http://icie.zkm.de
Homepage IRIE: http://www.i-r-i-e.net
Information Ethics Senior Fellow, 2007-2008, Center for Information Policy 
Research, School of Information Studies, UW-Milwaukee, USA


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


[Fis] info & meaning

2007-10-14 Thread Stanley N. Salthe
Referring to Bob's, John's, Loet's, Guy's, Beth's and Marcin's posts:
Bob Logan said:

>>>BL: The Relativity of Information
>>>In POE we associated biotic or instructional information with the
>>>organization that a biotic agent is able to propagate. This contradicts
>>>Shannon's definition of information and the notion that a random set or
>>>soup of organic chemicals has more Shannon information than a structured
>>>and organized set of organic chemicals found in a living organism.
>>S: This 'notion' would be valid IF the comparison was strictly between a
>>soup of subatomic particles and the arrangement of subatomic particles
>>entrained by a biotic organization.
>BL: The point I am making is that [biological] organization is a form of
>information not taken into account by Shannon
 S: But, this organization has its own behavioral variety at a larger
scale than the soup of organic chemicals entrained by this organization.
The chemicals' variety of positions and conformations at the smaller scale
would be constrained within the biological system, but that system itself
could be subject to analysis of its number of (in this case meaningful)
'complexions'. (As John points out, the meaningfulness results from
interpretation, or by way of Loet's 'system of reference'.)  I suppose this
is referred to in Loet's

>In the case of the random soup of organic chemicals, the maximum
>entropy of the systems is set by the number of chemicals involved (N).
>The maximum entropy is therefore log(N). (Because of the randomness of
>the soup , the Shannon entropy will not be much lower.)
>
>If a grouping variable with M categories is added the maximum entropy
>is log(N * M). Ceteris paribus, the redundancy in the system increases
>and the Shannon entropy can be expected to decrease.
  S: But suppose that the number of categories M is very large?
Consider how many conformations a human being could assume, then observe a
population, recording from the outside the positions and conformations of
its members.  Even if we did not know the meanings, we could in principle
make a Shannon type analysis.  This is allowed, I think because there is no
logical way to distinguish from the outside between a random event and an
arbitrary act (Whitehead).  Thus, we could calculate the system's
information carrying capacity without knowing the various meanings being
conveyed, which would appear random to us.  Or, if we knew the meanings
that the system generated by way of interpretation, we could calculate this
lower carrying capacity as well.
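
A small illustrative sketch of such an outside, Shannon-type tally (the
recorded conformations below are invented for the example):

from collections import Counter
import math

def entropy_from_observations(observations):
    """Shannon entropy in bits of the observed frequency distribution."""
    counts = Counter(observations)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical positions/conformations recorded for members of a population,
# without knowing what any conformation means.
recorded = ["sitting", "standing", "standing", "walking", "sitting",
            "standing", "waving", "sitting", "standing", "standing"]

h_observed = entropy_from_observations(recorded)   # capacity actually used
h_max = math.log2(len(set(recorded)))              # log of the distinct 'complexions' seen
print(f"observed H = {h_observed:.2f} bits, maximum H = {h_max:.2f} bits")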

Bob further said:

>Loet et al - I guess I am not convinced that information and entropy
>are connected. Entropy in physics has the dimension of energy divided
>by temperature. Shannon entropy has no physical dimension - it is
>missing the Boltzman constant. Therefore how can entropy and shannon
>entropy be compared yet alone connected?
 S: The relations here are {Shannon {Boltzmann}}.  That is,
conceptually, Boltzmann is logically a refinement of Shannon, which is a
generalized Boltzmann.

>I am talking about information not entropy - an organized collection
>of organic chemicals must have more meaningful info than an
>unorganized collection of the same chemicals.
  S:  Both the unorganized ensemble and an organized ensemble at a
different scale would have an information carrying capacity (informational
entropy), H.  This is what I think Guy refers to in his 'commonality' --
>I think that many of us suspect that this is the case and we are striving
>to understand and articulate (model) the >fundamental commonality.
I don't know which would be greater (Loet may tell us), but it could in
principle be calculated on either one.  AND, one could try to calculate the
number of MEANINGFUL conformations the organized kinds would be producing
if we knew the interpretive scheme.

Returning to finalities, Beth said:
>Part of what makes a story engaging are these novel,
>"non-reinforcing" elements. This relates to Stan's comment:
>>If resonant inputs to a system are nonreinforcing, they contradict
>>a system's finalities, and will then elicit learning or avoidance.
 S:  Note that the finalities are, however, not negated (transcended).
Contradiction might even highlight or reinforce whatever finalities are at
play.
 With respect to the kinds of meanings Beth refers to, we could
consider modern poetry (e.g. Mallarme, Eliot), where meanings are vague and
allusive, implied and polysemic.  I myself played this game for a while, and
I can say that the words per se were not relied upon to convey meaning, but
I also used dissonance, rhythm, internal rhyming, punning, archaisms,
sudden changes in scale and gender -- tools that have been despised in
contemporary poetry.   In this realm Shannon is completely defeated.

Marcin said:
>If information is not physical, and therefore governed by physical
>principles, then what is its ontological status?
 I would attempt to answer this by way of a hierarchy of l

Re: [Fis] info & meaning

2007-10-13 Thread Loet Leydesdorff
On 10/12/07, bob logan <[EMAIL PROTECTED]> wrote:

> Loet et al - I guess I am not convinced that information and entropy
> are connected. Entropy in physics has the dimension of energy divided
> by temperature. Shannon entropy has no physical dimension - it is
> missing the Boltzman constant. Therefore how can entropy and shannon
> entropy be compared yet alone connected?


Dear Bob:

I agree. Thermodynamic entropy has a physical meaning, while probabilistic
entropy is a purely mathematical concept. A mathematical concept is formal;
bits are dimensionless. Probabilistic entropy, Shannon-type information and
uncertainty are different words for the same concept.

I am talking about information not entropy - an organized collection
of organic chemicals must have more meaningful info than an
unorganized collection of the same chemicals.
However, in this next paragraph, you shift to "meaningful information".
Meaning can only be provided to the (Shannon-type) information by a system
of reference. The differences (i.e., the distribution) can then make a
difference for this observing system.

Let's elaborate the "difference which makes a difference." For example, the
first difference is "on/off" or [0,1]. For the purpose of the example, let's
assume that the second difference is [1,0]. Cross-tabulation leads to a
matrix with four possible combinations. More generally: if we have N classes
in the first distribution and M classes in the second, we obtain a matrix
with N x M classes and hence a maximum (Shannon) entropy of log(N x M).

The difference which has not yet made a difference, that is, the Shannon-type
information of the random soup, had only N classes. (N would be the number of
different chemical molecules in the soup). The organization in the living
organism has increased the redundancy with log(M), and therefore the
Shannon-type information has decreased.

This is just a straightforward answer to your question in a previous email.
Organization decreases the expected information content of the distribution
because a range of new possibilities is made available by adding a second
dimension or --as John Collier mentioned it-- a second degree of freedom.
Organization can always be written as an organization of a distribution
which was previously unorganized. The matrix contains more redundancy than
the vector.
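
A small numerical sketch of this argument (the distributions below are
invented for the example; only log(N) and log(N*M) come from the text above):

import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N, M = 8, 4

# Random soup: (close to) uniform over the N chemical classes.
soup = [1.0 / N] * N
h_soup, hmax_soup = H(soup), math.log2(N)

# Organized system: a joint distribution over the N*M cells of the
# cross-tabulation, concentrated in a few cells (illustrative values).
organized = [0.30, 0.25, 0.20, 0.10, 0.05, 0.05] + [0.05 / (N * M - 6)] * (N * M - 6)
h_org, hmax_org = H(organized), math.log2(N * M)

# Adding the grouping variable raises Hmax from log2(N) to log2(N*M), so the
# organized case shows more redundancy and a lower Shannon entropy than the soup.
print(f"soup:      H = {h_soup:.2f}, Hmax = {hmax_soup:.2f}, "
      f"redundancy = {1 - h_soup / hmax_soup:.2f}")
print(f"organized: H = {h_org:.2f}, Hmax = {hmax_org:.2f}, "
      f"redundancy = {1 - h_org / hmax_org:.2f}")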

The advantage of this approach is that it remains mathematical and can be
provided with an appreciation in different discourses. For example, one
expects the appreciation in biological discourse to be different from the
appreciation in economics. It seems to me that your definition of
organization is *a priori *biological. This is unnecessarily reductionistic.
Biology is a special theory which provides us with heuristics to study
fields like the social sciences. These heuristics can be formalized by using
a non-substantive, but formal apparatus. This enables us to specify the
differences in the non-linear dynamics of different systems of reference.

Furthermore, I would not know how to measure a "difference which makes a
difference" if it were not in this way using probabilistic entropy as a
methodology. I did not get that from your paper.

Let me add for the good order that I heavily lean in the above on Henri
Theil's *Statistical Decomposition Analysis *(Amsterdam: North Holland,
1972) and Brooks & Wiley's (1986) *Evolution as Entropy*.

Best wishes,  Loet

On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

>> Loet - if your claim is true then how do you explain that a random
>> soup of
>> organic chemicals have more Shannon info than an equal number of
>> organic
>> chemicals organized as a living cell where knowledge of some
>> chemicals
>> automatically implies the presence of others and hence have less
>> surprise
>> than those of the  soup of random organic chemicals? -  Bob
>
> Dear Bob and colleagues,
>
> In the case of the random soup of organic chemicals, the maximum
> entropy of the systems is set by the number of chemicals involved (N).
> The maximum entropy is therefore log(N). (Because of the randomness of
> the soup , the Shannon entropy will not be much lower.)
>
> If a grouping variable with M categories is added the maximum entropy
> is log(N * M). Ceteris paribus, the redundancy in the system increases
> and the Shannon entropy can be expected to decrease.
>
> In class, I sometimes use the example of comparing Calcutta with New
> York in terms of sustainability. Both have a similar number of
> inhabitants, but the organization of New York is more complex to the
> extent that the value of the grouping variables (the systems of
> communication) becomes more important than the grouped variable (N).
> When M is extended to M+1, N possibilities are added.
>
> I hope that this is convincing or provoking your next reaction.
>
> Best wishes,
>
>
> Loet





-- 
Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fa

[Fis] Info, meaning & narrative

2007-10-12 Thread Ted Goranson
Message forwarded from Beth Cardier. I understand it should have 
appeared a few days ago but for subscription difficulties.


Her email is [EMAIL PROTECTED]

-Ted
+++

Dear FIS colleagues,

My name is Beth Cardier, and I am currently researching "narrative as 
information" at Melbourne University. I haven't contributed to this 
list before but I have been watching it since I attended the FIS 
conference in Paris. In response to Ted's call for a hybrid space, I 
thought I'd share some of the ways that the ideas raised in this 
thread overlap with a few key concepts from the arts, as well as some 
of ways the logics differ. I also happen to know a lot about 
newspapers, which have recently entered the discussion.


As a writer and media analyst, I don't deal with the conveyance of 
meaning numerically, but I suspect there could be a geometric way to 
understand narrative. For me, it seems that the first difference 
between an artistic and scientific notions of information is that 
artistic logic doesn't operate in terms of context and content n 
there is no container and object. Perhaps linguistic principles are 
responsible for the traditional idea of object and link, noun and 
verb. Instead, storytelling deals in "situations," carrying an 
ontology in which participants are elements of context, because they 
compose the situation.


I define a "situation" as you would expect, as a cluster of embodied 
spaces in the actual world. In a story, aspects of a situation are 
represented as "images." Images are promiscuous in terms of what they 
can represent - not only tangible objects, but emotional sensations, 
philosophical positions, abstract impressions or metaphysical 
wonderings. They can be expressed via nouns or verbs, or both, or 
neither. They can also be expressed in the way the words are placed 
in relation to each other, or the frequency with which certain words 
occur. This is why linguistic descriptions of how words are related 
are not as important to me as understanding how images are related. 
My currency as a storyteller -- the way I create "meaning" -- lies in 
the association of images. I fit them together to create represented 
situations. Images are clustered into networks, like stones that join 
to form virtual pavements, interpermeated and overlapping.


The situated nature of narrative information means that it is 
conditional, and no element is absolute in its identity. Indeed, the 
relativity of information is the subject of this FIS thread. But 
there is an additional quality to artistic information that you might 
find interesting. In narrative, when information is added 
incrementally, the identity of the entire system changes, and so 
isn't reducible to its parts. Let me give a crude example from Media 
Analysis, an activity that mines news reports for dominant narrative 
patterns. It should be noted that this example is noun-oriented for 
ease.


When an analyst determines the public relations implications of a 
news story, one method is to look at the images present. For example, 
a news article that contains the words "president," "car accident" 
and "broken leg" conveys one sort of story (1). An article containing 
the words "president," "car accident," "broken leg," and "woman 
passenger" is a slightly different story (2). Watch how the story 
changes when the list of images becomes: president, car accident, 
broken leg, woman passenger, woman not president's wife, woman dead 
(3).


Even though these reports share identical elements, they have three 
different "meanings," and not only because the story is longer. For 
example, in each version, the role played by the image "broken leg" 
changes. In story 1, concern about the president's broken leg relates 
to the prestige and importance of the office he holds. In story 2, 
there are still suggestions of this concern, but the broken leg is 
also the reason we know about his trip with a female passenger (an 
analyst would recommend PR intervention). In story 3, the president 
is responsible for such a bad situation that the broken leg is almost 
a deserved punishment, the beginning of a much larger punishment to 
come (an analyst would recommend a lawyer).


Of course there can be slightly different readings for these three 
stories, depending on the finer links between their images, although 
that sort of variation tends to be more common in fiction. There is 
actually not much variety in news reports, their recycled templates 
being part of what makes them fast to write and read.


This begins to relate to Steven's and Christophe's observations about 
newspapers and interpretation. As you might guess, I see pattern as 
an agent in terms of information and meaning. In artistic terms, two 
patterns draw out each other's shared features simply by having 
similar structures. Perhaps this also speaks to Walter's thoughts 
about information transference. If I tell a story about a dying 
friend, and you have lost a friend in a simila

Re: [Fis] info & meaning

2007-10-12 Thread John Collier

At 09:01 PM 12/10/2007, bob logan wrote:

Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzman constant. Therefore how can entropy and shannon
entropy be compared yet alone connected?


Bob,

Temperature has the dimensions of energy per degree of freedom.
Do the dimensional analysis, and you end up with a measure in
degrees of freedom. This is a very reasonable dimensionality
for information.
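
An illustrative numerical aside (standard values; the framing is only an
illustration, not part of the argument above): the Boltzmann constant is
precisely the conversion factor said to be "missing" -- one bit of missing
information about a physical microstate corresponds to k_B * ln(2) in J/K.

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def thermodynamic_entropy_from_bits(h_bits):
    """Convert an informational entropy in bits to thermodynamic units (J/K)."""
    return K_B * math.log(2) * h_bits

# One bit of uncertainty about a microstate, expressed thermodynamically,
# and the corresponding minimum dissipation for erasing it at 300 K
# (Landauer's bound, k_B * T * ln 2).
print(thermodynamic_entropy_from_bits(1))          # ~9.57e-24 J/K
print(300 * thermodynamic_entropy_from_bits(1))    # ~2.87e-21 J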


I am talking about information not entropy - an organized collection
of organic chemicals must have more meaningful info than an
unorganized collection of the same chemicals.


I am planning to make some general comments on meaning, but
I am too busy right now. They will have to wait for later. There are
some very tricky issues involved, but I will say right now that
information is not meaningful, but has only a potential for meaning.
All information must  be interpreted to be meaningful, and the same
information can have very different meanings depending on how it
is interpreted. Information, on the other hand, has an objective
measure independent of interpretation, and that depends on
the measure of asymmetry within a system. See the recent book
by my student Scott Muller for details,  Asymmetry: The Foundation
of Information, Springer 2007.

http://www.amazon.ca/Asymmetry-Foundation-Information-Scott-Muller/dp/3540698833
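
A toy sketch of the general idea that asymmetry carries information (an
illustration only, not Muller's formal apparatus): the more symmetric a
configuration, the fewer distinguishable states it has, and the less
information its actual state can convey.

import math

def rotations(config):
    """All distinct cyclic rotations of a configuration laid out on a ring."""
    return {tuple(config[i:] + config[:i]) for i in range(len(config))}

def orientation_information_bits(config):
    """Bits needed to single out one orientation among the distinguishable ones."""
    return math.log2(len(rotations(config)))

symmetric = ['a', 'a', 'a', 'a']    # fully symmetric: 1 distinguishable state
asymmetric = ['a', 'b', 'c', 'd']   # no rotational symmetry: 4 states
print(orientation_information_bits(symmetric))    # 0.0 bits
print(orientation_information_bits(asymmetric))   # 2.0 bits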

This whole discussion on meaning needs far more precision and a
lot of garbage collecting.

Cheers,
John




--
Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-12 Thread Guy A Hoelzer
Bob,

If the notions of Entropy and Shannon Information are alternative approaches
to characterize the same phenomenon in Nature, then the ways they have been
modeled would not necessarily reveal an underlying and fundamental
commonality.  I think that many of us suspect that this is the case and we
are striving to understand and articulate (model) the fundamental
commonality.  We may be wrong, of course, but your argument doesn't dissuade
me.  In fact, I must admit sheepishly that I'm not sure how one would go
about analyzing the relationship between these ideas in a way that could
dissuade me.  If such an analysis is not possible, then I suspect that the
question of a fundamental commonality will have to die slowly as progress
fails to occur.

Regards,

Guy


on 10/12/07 12:01 PM, bob logan at [EMAIL PROTECTED] wrote:

> Loet et al - I guess I am not convinced that information and entropy
> are connected. Entropy in physics has the dimension of energy divided
> by temperature. Shannon entropy has no physical dimension - it is
> missing the Boltzman constant. Therefore how can entropy and shannon
> entropy be compared yet alone connected?
> 
> I am talking about information not entropy - an organized collection
> of organic chemicals must have more meaningful info than an
> unorganized collection of the same chemicals.
> 
> 
> On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:
> 
>>> Loet - if your claim is true then how do you explain that a random
>>> soup of
>>> organic chemicals have more Shannon info than an equal number of
>>> organic
>>> chemicals organized as a living cell where knowledge of some
>>> chemicals
>>> automatically implies the presence of others and hence have less
>>> surprise
>>> than those of the  soup of random organic chemicals? -  Bob
>> 
>> Dear Bob and colleagues,
>> 
>> In the case of the random soup of organic chemicals, the maximum
>> entropy of the systems is set by the number of chemicals involved (N).
>> The maximum entropy is therefore log(N). (Because of the randomness of
>> the soup , the Shannon entropy will not be much lower.)
>> 
>> If a grouping variable with M categories is added the maximum entropy
>> is log(N * M). Ceteris paribus, the redundancy in the system increases
>> and the Shannon entropy can be expected to decrease.
>> 
>> In class, I sometimes use the example of comparing Calcutta with New
>> York in terms of sustainability. Both have a similar number of
>> inhabitants, but the organization of New York is more complex to the
>> extent that the value of the grouping variables (the systems of
>> communication) becomes more important than the grouped variable (N).
>> When M is extended to M+1, N possibilities are added.
>> 
>> I hope that this is convincing or provoking your next reaction.
>> 
>> Best wishes,
>> 
>> 
>> Loet
> 
> ___
> fis mailing list
> fis@listas.unizar.es
> http://webmail.unizar.es/mailman/listinfo/fis

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-12 Thread bob logan
hi stan - I think we need to agree to disagree - for me final cause  
involves purpose or intention. A gas that expands into a larger  
volume has no intention or purpose and therefore one cannot talk  
about final cause because there is no agent - I did enjoy your  
argument but I just cannot buy it - Bob



On 12-Oct-07, at 3:01 PM, bob logan wrote:

Loet et al - I guess I am not convinced that information and  
entropy are connected. Entropy in physics has the dimension of  
energy divided by temperature. Shannon entropy has no physical  
dimension - it is missing the Boltzman constant. Therefore how can  
entropy and shannon entropy be compared yet alone connected?


I am talking about information not entropy - an organized  
collection of organic chemicals must have more meaningful info than  
an unorganized collection of the same chemicals.



On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a  
random soup of
organic chemicals have more Shannon info than an equal number of  
organic
chemicals organized as a living cell where knowledge of some  
chemicals
automatically implies the presence of others and hence have less  
surprise

than those of the  soup of random organic chemicals? -  Bob


Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the systems is set by the number of chemicals involved  
(N).
The maximum entropy is therefore log(N). (Because of the  
randomness of

the soup , the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system  
increases

and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provoking your next reaction.

Best wishes,


Loet


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-12 Thread bob logan
Loet et al - I guess I am not convinced that information and entropy  
are connected. Entropy in physics has the dimension of energy divided  
by temperature. Shannon entropy has no physical dimension - it is  
missing the Boltzmann constant. Therefore how can entropy and Shannon  
entropy be compared let alone connected?


I am talking about information not entropy - an organized collection  
of organic chemicals must have more meaningful info than an  
unorganized collection of the same chemicals.



On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a random  
soup of
organic chemicals have more Shannon info than an equal number of  
organic
chemicals organized as a living cell where knowledge of some  
chemicals
automatically implies the presence of others and hence have less  
surprise

than those of the  soup of random organic chemicals? -  Bob


Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the systems is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup , the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system increases
and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provoking your next reaction.

Best wishes,


Loet


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-12 Thread Stanley N. Salthe
Bob --

>Hi Stan - once again I enjoyed your remarks amplifying your original
>comment. I would like to add that science only deals with formal cause and
>not final cause. Final cause is for philosophers, social critics and
>theologians.

 S: Well, Bob, I will disagree with this.  I think finalities have been
sub rosa AS such.  It seems to me that variational principles erect
finalities, in two senses.
(1) Where some value must increase or decrease with any transaction, as
with increase in entropy, or positive entropy production, in accordance
with the Second Law of thermodynamics, or, in evolutionary biology,
increase in fitness in Fisher's fundamental theorem of natural selection.
(I will leave out certain interpretations of QM, because I don't understand
them well enough.)
  To see in a stronger sense why the Second Law is finalistic, we place
it in the context of the Big Bang.  This ongoing event has created a
thermodynamically nonequilibrium world (for this view we need the plausible
supposition that the universe is an isolated system). Its tendency to gain
equilibrium is the Second Law.  This allows us to view the spontaneous
dissipation of energy gradients as finalistic.  Next, in order to see how
dissipative structures, including the living, are involved in this
finality, we note the critical fact of the poor energy efficiency of
natural work with effective work loads -- somewhere around 50%.  That is,
work serving the creation and maintenance of such systems is about equally
in the service of the Second Law.  Thus, as I type this message, I try to
communicate, but I am also contributing my share to Universal
equilibration.  Thus, two facts: the manifest disequilibrium partout, and
the poor energy efficiency of work, allow us to find finality in the Second
Law.

(2) Where some equation descriptive of a system cannot be solved unless a
parameter is minimized or maximized.  Here the finality is erected by the
observer, but the observed could not be known without it.  It is therefore
only known by way of a portal through final cause, which in this case might
be said to be virtual in Nature as known through science. It needs to be
realized that Nature is known ONLY through its observers.  That is to say,
the observer's properties mingle with Nature's.

STAN






___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-11 Thread Loet Leydesdorff
> Loet - if your claim is true then how do you explain that a random soup of
> organic chemicals have more Shannon info than an equal number of organic
> chemicals organized as a living cell where knowledge of some chemicals
> automatically implies the presence of others and hence have less surprise
> than those of the  soup of random organic chemicals? -  Bob

Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the systems is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup , the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system increases
and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provoking your next reaction.

Best wishes,


Loet
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-11 Thread bob logan
Dear Colleagues - I have combined my responses to a number of  
comments (from Ted, Stan and Loet) in one email. My comments are in  
red and the comments of the others that I am commenting on are in blue.


Hi Ted - I did not understand why you claim Kauffman's approach is  
algebraic.


At the highest level, I think, there's a decision each of us has to  
make about the nature of the abstraction space we want to work in.  
Kauffman is famously on record as believing that the preferred space  
is algebraic. He goes further in stating that it is the ONLY space,  
all others illusory, or less fundamental. Without burdening you with  
that rather indefensible weight, it's clear to me that what you have  
presented is clearly in this camp.



Hi Stan - once again I enjoyed your remarks amplifying your original  
comment. I would like to add that science only deals with formal  
cause and not final cause. Final cause is for philosophers, social  
critics and theologians.


Bob said:



Hi Stan - interesting ideas - I resonate with the thought that the
meaning of info is associated with  Aristotle's final cause -  
cheers Bob




Here I follow up with an extract from a text I am working on at present,
just to amplify this a bit more:

 Finally, what is the justification for considering meaning  
generally to be
associated to finality?  Why not formal causes as well?  Final cause  
is the
'why' of events, while formal causes carry the 'how' and 'where',  
material

causes the local 'readiness', and efficient cause the 'when' (Salthe,
2005).  It seems clear when choosing among these, that the meaning
(significance, purport, import, aim -- Webster's New Collegiate  
Dictionary)

of an event must be assigned to its final causes.  Formal and material
causes are merely enabling, while efficient cause only forces or  
triggers.
The example of a New Yorker cartoon captures some of my meaning  
here.  Two
Aliens are standing outside of their spaceship, which has apparently  
landed
on Earth, as we see spruce trees burning all around them in a fire  
that we
infer was triggered by their landing. One of them says: "I know what  
caused it

-- there's oxygen on this planet".  If we think that is amusing, we know
implicitly why formality cannot carry the meaning of an event.  In  
natural
science formality has been used to model the structure of an  
investigated

system, and so is not suited to carrying its teleo tendencies as well.
Formality marks what will happen where, but not also 'why' it  
happens. The

causal approach itself is required if we are trying to extend semiosis
pansemiotically to nature in general. Natural science discourse is built
around causality, and so attempts to import meaning into it requires  
it to

be assimilated to causation.

Loet - if your claim is true then how do you explain that a random  
soup of organic chemicals has more Shannon info than an equal number  
of organic chemicals organized as a living cell where knowledge of  
some chemicals automatically implies the presence of others and hence  
have less surprise than those of the  soup of random organic  
chemicals? -  Bob



On 7-Oct-07, at 6:47 AM, Loet Leydesdorff wrote:


Dear Bob and colleagues,

Although I know that this comment was made in responding to another  
comment, let me react here because I think that this is not correct:
The point I am making is that organization is a form of  
information which Shannon theory does not recognize.


Shannon's theory is a mathematical theory which can be used in an  
application context (e.g., biology, electrical engineering) as a  
methodology. This has been called entropy statistics or, for  
example, statistical decomposition analysis (Theil, 1972). The  
strong methodology which it provides may enable us to answer  
theoretical questions in the field of application.


An organization at this level of abstraction can be considered as a  
network of relations and thus be represented as a matrix. (Network  
analysis operates on matrices.) A matrix can be considered as a two- 
dimensional probability distribution which contains an uncertainty.  
This uncertainty can be expressed in terms of bits of information.  
Similarly, for all the submatrices (e.g., components and cliques)  
or for any of the row or column vectors. Thus, one can recognize  
and study organization using Shannon entropy-measures.


The results, of course, have still to be appreciated in the  
substantive domain of application, but they can be informative to  
the extent of being counter-intuitive.


Best wishes,


Loet
Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
[EMAIL PROTECTED] ; http://www.leydesdorff.net/

Now available: The Knowledge-Based Economy: Modeled, Measured,  
Simulated. 385 pp.; US$ 18.95
The Self-Organization of the Knowledge-Based Society; The Challenge  
of Scientometrics





___

Re: [Fis] info & meaning

2007-10-10 Thread Pedro Marijuan

Dear FIS colleagues,

Thanks to all the discussants for the ideas & comments. Something other than 
meaning is at stake in the current discussion. At fis, aren't we at a 
delicate juncture? After more than 10 years of discontinuous activities, 
but overall having accumulated a very respectable amount of knowledge & 
insights, we are still at the "isolation" phase --following the classical 
three stages in the emergence of new disciplines: 1. Isolation, merely a 
few scholars.  2. Identification, creating an organized nucleus and a first 
research center and starting some regular meetings, teaching & lectures, 
and succeeding with a journal and some big books. 3. Institutionalization, 
receiving a lot of attention & PhD students, creating ad hoc departments in 
big universities, creation of professional societies, many journals & books, etc.


Actually, organizing Paris 2005 by Michel, to be continued by another 
meeting in the US, and the Info Institute idea launched by Dail, Mary Jo & 
Wolfgang appeared as two of the classical requisites to enter into phase 2. 
Plus a modest fis board that initiated some exchanges in spring-summer... 
But how are we faring in all these items? It is annoying that although 
some of our contents are in phase 3, organizationally we are still in phase 1!


So, Steve and Ted were quite right --it is "a call to arms" that we 
need. The FIS initiative is very atypical and maybe it will have to find 
its own atypical way too.  All FISers interested in moving our problems 
ahead should share reflections on possibilities and resources; right now 
the easiest way may be to try to interleave both kinds of contents 
--organization & discussion. At least, we can try for a while.


best wishes

Pedro

PS. In this case, given the centrality of the absence of Meaning in the 
classical theories of information, and given the opportunities opened by the 
conceptual revolution of Systems Biology that is transforming biology, 
neuroscience and medicine, plus the intense  "informationalization" of new 
ideas in physics (see "Programming the Universe" 2005 by Seth Lloyd, 
"Decoding the Universe" 2006 by Charles Seife, and "Information, the New 
Language of Science" 2004 by our FIS colleague Hans Christian von Baeyer), 
plus other robust trends in a variety of fields... the point is whether a 
common articulation of these info avantgarde ideas is feasible. Let me bet 
that "zero" is not far away.


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-08 Thread karl javorszky
Let me support Pedro's main theses:
Pedro: we live in a period comparable to that in which the Zero was evolved.
Yes, we do. The concept is unusual, expresses a feeling that we can share,
has no clear understanding yet, is in some ways revolutionary.
Pedro: The Zero as a concept lies behind all numbers. Its properties can be
figured out by thinking, because as a sensual experience it is not there.
Yes, we can recognise the relative background, even if there is no content
before the background.
Pedro:  There exists a central rule / idea / algorithm / logos / divine will
/ any_other_name that drives a.o. the cell. This rule / idea / etc. could
well be transcendent over all and everything that is living / existing
/spoken about / . This is what we want to find.
Yes, there appears to be a fundamental governing relation among the
holograms in our brains, which is interindividual. So we can talk about and
understand each other as we talk about the order among the concepts in my,
your, his, any other's head.

If there is a general order, then we can recognise it. Let us first look
into what we do in order to avoid recognition. The defence mechanisms of the
brain have been talked about much already. Here, they show up again, as we
strictly decline to revisit what we have been told at the tender age of 6
years.

There is a multitude of rules of thinking. Once learnt, they stay. Again,
Pedro hits the point with "after a few centuries" people have - after much
harrump and khrm - embraced the idea of Zero.

So we do the real science here, l'art pour l'art. In a few centuries, people
will say: "not bad, the idea." Maybe they will also say other remarks.

One more point: Zero is extremely practical to use. What a pity for those
few centuries between unearthing the idea and using it as a common reference
background (something to zero in on). That would be the average meaning of a
symbol, which is of course as useless as "a Zero" as such. If one does not
know the context of a Zero, it does indeed not shine much, as a Zero. So it
is with information, too. If one does not know its context, the information
is useless.

Anyway, nice discussion starting. Is there computing power available
anywhere among the esteemed colleagues?

Karl





2007/10/7, Loet Leydesdorff <[EMAIL PROTECTED]>:
>
>  Dear Bob and colleagues,
>
> Although I know that this comment was made in responding to another
> comment, let me react here because I think that this is not correct:
>
>  The point I am making is that organization is a form of information which
> Shannon theory does not recognize.
>
>
> Shannon's theory is a mathematical theory which can be used in an
> application context (e.g., biology, electrical engineering) as a
> methodology. This has been called entropy statistics or, for example,
> statistical decomposition analysis (Theil, 1972). The strong methodology
> which it provides may enable us to answer theoretical questions in the field
> of application.
>
> An organization at this level of abstraction can be considered as a
> network of relations and thus be represented as a matrix. (Network analysis
> operates on matrices.) A matrix can be considered as a two-dimensional
> probability distribution which contains an uncertainty. This uncertainty can
> be expressed in terms of bits of information. Similarly, for all the
> submatrices (e.g., components and cliques) or for any of the row or column
> vectors. Thus, one can recognize and study organization using Shannon
> entropy-measures.
>
> The results, of course, have still to be appreciated in the substantive
> domain of application, but they can be informative to the extent of being
> counter-intuitive.
>
> Best wishes,
>
>
> Loet
> --
>  Loet Leydesdorff
> Amsterdam School of Communications Research (ASCoR)
> Kloveniersburgwal 48, 1012 CX Amsterdam
> Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
> [EMAIL PROTECTED] ; http://www.leydesdorff.net/
>
> Now available: *The Knowledge-Based Economy: Modeled, Measured, 
> Simulated*.
> 385 pp.; US$ 18.95
> *The Self-Organization of the Knowledge-Based 
> Society*
> *; **The Challenge of 
> Scientometrics*
>
>
>
>
> ___
> fis mailing list
> fis@listas.unizar.es
> http://webmail.unizar.es/mailman/listinfo/fis
>
>
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] Info & meaning

2007-10-07 Thread Stanley N. Salthe
Bob said:

>Hi Stan - interesting ideas - I resonate with the thought that the
>meaning of info is associated with  Aristotle's final cause - cheers Bob

Here I follow up with an extract from a text I am working on at present,
just to amplify this a bit more:

 Finally, what is the justification for considering meaning generally to be
associated to finality?  Why not formal causes as well?  Final cause is the
'why' of events, while formal causes carry the 'how' and 'where', material
causes the local 'readiness', and efficient cause the 'when' (Salthe,
2005).  It seems clear when choosing among these, that the meaning
(significance, purport, import, aim -- Webster's New Collegiate Dictionary)
of an event must be assigned to its final causes.  Formal and material
causes are merely enabling, while efficient cause only forces or triggers.
The example of a New Yorker cartoon captures some of my meaning here.  Two
Aliens are standing outside of their spaceship, which has apparently landed
on Earth, as we see spruce trees burning all around them in a fire that we
infer was triggered by their landing. One of them says: "I know what caused it
-- there's oxygen on this planet".  If we think that is amusing, we know
implicitly why formality cannot carry the meaning of an event.  In natural
science formality has been used to model the structure of an investigated
system, and so is not suited to carrying its teleo tendencies as well.
Formality marks what will happen where, but not also 'why' it happens. The
causal approach itself is required if we are trying to extend semiosis
pansemiotically to nature in general. Natural science discourse is built
around causality, and so attempts to import meaning into it requires it to
be assimilated to causation.

Later, Bob asked:
>On Oct 2 Guy Hoelzer wrote: In my view meaning exists (or not) exclusively
>within systems. It exists to the extent that inputs (incoming information)
>resonate within the structure of the system. The resonance can either
>reinforce the existing architecture (confirmation), destabilize it (e.g.,
>cognitive disequilibrium), or construct new features of the architecture
>(e.g., learning). I like this contribution and the comments made by Stan
>Salthe also on Oct 2 - they parallel Fredkin's idea:
>"The meaning of information is given by the processes that interpret it."
>Would you agree Guy and Stan?
 S:  Yes, this is basic to the semiotic approach.  It can, of course, at
the same time be context dependent.

Then Bob said:
>I agree it is only living and perhaps prebiotic things that have
>processes, namely propagating their organization and are therefore capable
>of interpreting information and hence according to Fredkin providing it
>with meaning. When a rock is acted upon by earth's gravity it does not
>have to interpret because it has no options it can only behave as
>causality demands.
 S: This statement is characteristically (not of Bob!) misleading. No
pansemiotician would suppose that a rock, per se, can be a system of
interpretance.  As with thermodynamics, meaning generation requires a
'correct' identification of the system being interrogated.  I am not
capable of saying here and now what the appropriate semiotic system would
be in the case of this rock, but taking the semiotic approach allows us to
search for it.

>Living things make choices - Bacteria decide to swim towards or away from
>a substance depending on their interpretation of whether it is food or
>toxin. The meaning of a glucose gradient to a bacteria is food, survival,
>I want it.
  S: In these cases the work of searching for the system was easy
enough. It is with abiotic systems, including (incidentally) species and
ecosystems (not themselves living), that the search for semiosis is more
difficult.

>Living things have agency whereas non-living things do not. As for
>pre-biotics I do not know enough biology or pre-biology to comment but
>obviously there is going to be a boundary between animate and inanimate
>matter and I do not know how sharp that boundary is.
 S: There are such boundaries, but we do not know if semiosis is one of
them.

Steven said
>My apparently simplistic proposal, that "meaning" refer to the
>behavior that is the product of a communication, should be seen in
>this context. It, in fact, applies at all scales. It may not be
>immediately apparent to you that it applies in the case of complex
>organisms like ourselves, but it does. The behavioral complex of our
>physiology produces a variety of small and potentially large
>behavioral changes on the receipt of information, for example in the
>complex assessment of what is benign and what is a threat, and in how
>to deal with information overload and how to deal with limited
>information.
  S: This matter of 'all scales' continues my above thought.  With
systems of scale larger than our observation scale, it is not easy to
identify the 'system' or to find the 'product of communication' in many
cases.

Concern

Re: [Fis] info & meaning

2007-10-07 Thread Steven Ericsson-Zenith
.


Thunderstorm noise present in the air has no meaning. But when you  
hear it, it can participate in meaning generation for you (and the  
meaning will be different if you are under a shelter or on the beach).


A moving fly seen by a hungry frog will generate a meaning for the  
frog (available food that can be eaten). The same fly, but not  
moving, will not generate any meaning for the same frog. The frog’s  
visual system is capable of interpreting only moving flies, not flies  
at rest.


So information present around us can be meaningful (contain  
meanings, like newspapers do), or meaningless like thunderstorm  
noise. A system capable of receiving the information may or may not  
generate a meaning associated to the received information. It will  
depend upon the constraints that the system has to satisfy and upon  
the relations that these constraints entertain with the received  
information. The fact that the received information may already be  
meaningful brings to take into account another layer in meaning  
generation. Again, this is close to a simplified version of the  
Peircean triadic approach on sign.


This is what I mean by "Information and meaning exist around us".

More is available in http://www.mdpi.org/entropy/papers/e5020193.pdf

All the best

Christophe



-Original Message-
From: Steven Ericsson-Zenith [mailto:[EMAIL PROTECTED]
Sent: Saturday, 6 October 2007 11:08
To: Christophe MENANT
Subject: Re: TR: SV: [Fis] info & meaning



Dear Christophe,



(I'll go over my quota for the week if I post directly to the list,

Pedro allows 3 posts per week)



You refer to information and meaning "Information and meaning exist

around us". It seems possible also that I was also prompted to use

the phrase "abstract meaning" because your PDF abstract runs

"Abstract: Information and meaning exist around us ..." but it

doesn't matter if that was the case, the substance of my comments

still hold.



With respect,

Steven





--

Dr. Steven Ericsson-Zenith

Institute for Advanced Science & Engineering

http://iase.info

http://senses.info







On Oct 6, 2007, at 12:46 AM, Christophe MENANT wrote:



> Steven,

>

> The Meaning Generation System (1) is linked to basic semiotics and

> follows a pragmatic approach. It cannot be vague nor introduce

> ‘abstract meaning’. Could you tell where you have found this ?

>

> All the best

>

> Christophe.

>

> (1)http://cogprints.org/4532/

>

> -Original Message-

> From: Steven Ericsson-Zenith [mailto:[EMAIL PROTECTED]

> Sent: Friday, 5 October 2007 21:00

> To: Christophe MENANT

> Cc: fis@listas.unizar.es

> Subject: Re: TR: SV: [Fis] info & meaning

>

>

>

> Dear Christophe,

>

>

>

> I am not satisfied by this definition of "meaning." It is vague and

>

> uncertain. Your paper also introduces a notion of "abstract meaning"

>

> where I believe you are referring to "marks" or the latent potential

>

> of "meaning" (by my definition) in the world. So I think the problem

>

> is that there is not a sufficiently rigorous framework here.

>

>

>

> My interpretation of Pedro's call to arms was that he was indeed

>

> calling for a rigorous semeiotic science, which is what the

>

> "information science" he described necessarily becomes. I took his

>

> call to be one that appealed for foundational work, not merely a

>

> tidying up of convention.

>

>

>

> The definition of meaning that I am looking for, like Pedro, is one

>

> that can be applied with a rigor that extends Shannon and applies to

>

> biophysics. This rigorous evolution is already underway in the

>

> community with the consideration of "algorithmic information

>

> theory" (Chaitin, Wolfram et al.). I don't share their optimism for

>

> emergence theory as the ultimate solution to all things but the

>

> development of the notion of algorithmic information itself is a

>

> useful and rigorous step forward.

>

>

>

> My apparently simplistic proposal, that "meaning" refer to the

>

> behavior that is the product of a communication, should be seen in

>

> this context. It, in fact, applies at all scales. It may not be

>

> immediately apparent to you that it applies in the case of complex

>

> organisms like ourselves, but it does. The behavioral complex of our

>

> physiology produces a variety of small and potentially large

>

> behavioral changes on the receipt of information, for example in the

>

> complex assessment of what is benign and what is a threat, and in how


>

> to deal with information overload and how to

RE: [Fis] info & meaning

2007-10-07 Thread Loet Leydesdorff
Dear Bob and colleagues, 
 
Although I know that this comment was made in responding to another comment,
let me react here because I think that this is not correct:

The point I am making is that organization is a form of information which
Shannon theory does not recognize.
 

Shannon's theory is a mathematical theory which can be used in an
application context (e.g., biology, electrical engineering) as a
methodology. This has been called entropy statistics or, for example,
statistical decomposition analysis (Theil, 1972). The strong methodology
which it provides may enable us to answer theoretical questions in the field
of application.
 
An organization at this level of abstraction can be considered as a network
of relations and thus be represented as a matrix. (Network analysis operates
on matrices.) A matrix can be considered as a two-dimensional probability
distribution which contains an uncertainty. This uncertainty can be
expressed in terms of bits of information. The same holds for all the
submatrices (e.g., components and cliques) and for any of the row or column
vectors. Thus, one can recognize and study organization using Shannon
entropy measures.
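A minimal sketch, in Python, of this kind of entropy statistics; the 3x3
relation matrix below is a purely hypothetical illustration, not data from
any study:

import math

# Hypothetical 3x3 relation matrix, e.g. counts of links between three groups.
matrix = [[5, 1, 0],
          [2, 7, 1],
          [0, 3, 6]]

total = sum(sum(row) for row in matrix)
p = [[cell / total for cell in row] for row in matrix]  # two-dimensional probability distribution

def H(probs):
    # Shannon entropy (in bits) of a list of probabilities.
    return -sum(q * math.log2(q) for q in probs if q > 0)

H_joint = H([q for row in p for q in row])     # uncertainty of the whole matrix
H_rows  = H([sum(row) for row in p])           # uncertainty of the row marginal
H_cols  = H([sum(col) for col in zip(*p)])     # uncertainty of the column marginal
I_rc    = H_rows + H_cols - H_joint            # mutual information between rows and columns

print(H_joint, H_rows, H_cols, I_rc)

The same decomposition can be applied, in bits, to any submatrix or vector,
which is what makes components and cliques comparable.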
 
The results, of course, have still to be appreciated in the substantive
domain of application, but they can be informative to the extent of being
counter-intuitive.
 
Best wishes, 
 
 
Loet 
  _  

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
  [EMAIL PROTECTED] ;
 http://www.leydesdorff.net/ 

 
Now available:

The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$
18.95 
 
The Self-Organization of the Knowledge-Based Society;

The Challenge of Scientometrics

 
 
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Fwd: [Fis] info & meaning

2007-10-06 Thread bob logan



Begin forwarded message:


From: bob logan <[EMAIL PROTECTED]>
Date: October 5, 2007 12:26:46 PM EDT (CA)
To: Pedro Marijuan <[EMAIL PROTECTED]>
Subject: Re: [Fis] info & meaning

Dear Colleagues - I am overwhelmed by all the responses, in terms of  
their number and their depth. This has been a very busy time as a  
new school year has just begun. I have just signed a contract for my  
new book Understanding New Media: Extending Marshall McLuhan, which  
requires a manuscript by the end of Oct.; my book The Extended  
Mind: The Emergence of Language, the Human Mind and Culture is  
released on Oct. 22 and I have to plan a book launch and publicity  
for it; and finally I hosted Katherine Hayles this week in Toronto,  
where she spoke at the Ontario College of Art and Design. So this is  
my excuse, and I hope you will forgive my long silence.


So here go my comments (in red), as many as I can make, with more to  
follow. First of all let me share a wonderful quote from Claude  
Shannon that supports my hypothesis that information is a relative  
thing and is context dependent.


This paper appears in: Information Theory, IEEE Transactions on
Publication Date: Feb 1953
Volume: 1,  Issue: 1
On page(s): 105- 107
ISSN: 0018-9448
Posted online: 2003-04-22 14:03:12.0

Abstract

The word "information" has been given many different meanings by  
various writers in the general field of information theory. It is  
likely that at least a number of these will prove sufficiently  
useful in certain applications to deserve further study and  
permanent recognition. It is hardly to be expected that a single  
concept of information would satisfactorily account for the  
numerous possible applications of this general field. The present  
note outlines a new approach to information theory which is aimed  
specifically at the analysis of certain communication problems in  
which there exist a number of information sources simultaneously in  
operation. A typical example is that of a simple communication  
channel with a feedback path from the receiving point to the  
transmitting point. The problem is to make use of the feedback  
information for improving forward transmission, and to determine  
the forward channel capacity when the best possible use is made of  
this feedback information. Another more general problem is that of  
a communication system consisting of a large number of transmitting  
and receiving points with some type of interconnection network  
between the various points. The problem here is to formulate the  
best systems design whereby, in some sense, the best overall use of  
the available facilities is made. While the analysis sketched here  
has not yet proceeded to the point of a complete solution of these  
problems, partial answers have been found and it is believed that a  
complete solution may be possible.



On 24-Sep-07, at 11:56 AM, Pedro Marijuan wrote:


Dear Bob and colleagues,

Thanks for the scholarly work. It was nice reading the vast  
landscape, historical and otherwise, you have covered around that  
new biotic interpretation of information as "Propagating  
Organization" or POE. Perhaps in this list we have already arrived  
at a truce on Shannonian matters --notwithstanding a general  
agreement with your criticisms on that respect, the problem may be  
a lack of interesting alternative generalizations, inclusive  
enough so that the Shannonian theory gets its proper place as a  
great theory of communication and as a great theory of physical  
structures (in the guise of "physical information theory") and  
also as a great theory in data analysis.


I agree Shannon theory is powerful but I also agree with Shannon  
that its scope is limited.


Thus I am sympathetic with integrative quests such as POE, but  
given that disagreements are usually the most valuable stuff for  
discussions, I would raise the following points:


 --1. Constraints and boundary conditions are conflated (see  
below), with some more emphasis on the former. Thus, taking into  
account that most chemical constraints are in the form of  
"activation energies" and "free energies" which establish the  
kinetics of the multitude of chemical reactions and molecular  
transformations in the cell, how may this heterogeneous collection of  
variables and parameters receive a form of global treatment, as  
POE seems to imply? I have seen no hints in the paper.


 ...we defined a new form of information, which we called  
instructional or biotic information, not with Shannon, but with  
constraints or boundary conditions. The amount of information  
will be related to the diversity of constraints and the diversity  
of processes that they can partially cause to occur.


As a result of my conversations with Hayles I believe I need to put  
more emphasis on the processes that the constraints make possible.  
So I plan to think more about the relationship of bound

Re: TR: SV: [Fis] info & meaning

2007-10-06 Thread Steven Ericsson-Zenith

Dear Christophe,

I am not satisfied by this definition of "meaning." It is vague and  
uncertain. Your paper also introduces a notion of "abstract meaning"  
where I believe you are referring to "marks" or the latent potential  
of "meaning" (by my definition) in the world. So I think the problem  
is that there is not a sufficiently rigorous framework here.


My interpretation of Pedro's call to arms was that he was indeed  
calling for a rigorous semeiotic science, which is what the  
"information science" he described necessarily becomes. I took his  
call to be one that appealed for foundational work, not merely a  
tidying up of convention.


The definition of meaning that I am looking for, like Pedro, is one  
that can be applied with a rigor that extends Shannon and applies to  
biophysics. This rigorous evolution is already underway in the  
community with the consideration of "algorithmic information  
theory" (Chaitin, Wolfram et al.). I don't share their optimism for  
emergence theory as the ultimate solution to all things but the  
development of the notion of algorithmic information itself is a  
useful and rigorous step forward.


My apparently simplistic proposal, that "meaning" refer to the  
behavior that is the product of a communication, should be seen in  
this context. It, in fact, applies at all scales. It may not be  
immediately apparent to you that it applies in the case of complex  
organisms like ourselves, but it does. The behavioral complex of our  
physiology produces a variety of small and potentially large  
behavioral changes on the receipt of information, for example in the  
complex assessment of what is benign and what is a threat, and in how  
to deal with information overload and how to deal with limited  
information.


For me then, marks contain potential information. One might say they  
possess "latent meaning." They produce information in their  
apprehension that adds to knowledge and produces behavior (recall, my  
definition of "knowledge" is generalized to "that which determines  
subsequent action").


So then we can now speak specifically about what a "meaning" is. This  
definition works perfectly well if you are referring to the meaning  
of any syntactic entity, be it a computer program interpreted by a  
machine or an informal communication between us. It works perfectly  
well if you apply it to logical syntax exchanged between us. It works  
perfectly well if we apply it to a computer language, mathematical  
logic or works of art.


If you were identical to me in all important respects then you would  
understand exactly what I have said here and it would produce in you  
behavior exactly like that in me. But we are not identical and  
therein lies the variance.


So the questions to resolve are

	1. Is the sender sending a message that is complete? That is, does  
the message contain all the information to reproduce the sender  
behavior in an identical receiver? This is the "meaning" in the  
sender. More precisely, the "meaning" in the sender is exactly the  
behavior produced by the mark that is the "message" in the sender and  
no other.

2. Signal to noise ratio; successful transmission of the mark.
	3. The behavior the message produces in the receiver; the "meaning"  
in the receiver. As noted before, if sender and receiver are  
identical and the message is complete and clear it will produce a  
determined behavior - the same behavior as found in the sender.


There are, of course, differences between you and me (including  
culture and educational background) so I cannot expect that this note  
produces in you what it produces in me (and the old adage of  
information theory and computer science applies: garbage in, garbage  
out).


The fact that the term "meaning" is overloaded in conventional  
language and not rigorously used should not deter us from the  
clarification of the concept here. When someone says "What do you  
mean?" they are really asking "How does this information change how  
you behave?" Where behavior covers all levels of process within the  
organism.


I think Soren and I may well be in agreement up to this point. Where  
we disagree is on the mechanics involved, and especially concerning  
the mechanics of sentience. At this level of definition however this  
is unimportant. I suggest that the nature of "consciousness" is only  
relevant if some aspect of it plays a role in these mechanics (as it  
does in my model and not in Soren's).


With respect,
Steven

--
Dr. Steven Ericsson-Zenith
Institute for Advanced Science & Engineering
http://iase.info
http://senses.info



On Oct 5, 2007, at 3:56 AM, Christophe MENANT wrote:


Steven,

In a few words, what I understand by “meaning”.

1) We all agree that the Shannon theory of information addresses  
the capacity of transmission of a communication channel. It does  
not deal at all with the possible meaning associated with the  
information. A different approach i

Re: [Fis] Info & meaning

2007-10-06 Thread bob logan
Hi Stan - interesting ideas - I resonate with the thought that the  
meaning of info is associated with Aristotle's final cause - cheers Bob



On 25-Sep-07, at 4:39 PM, Stanley N. Salthe wrote:


First I comment on Pedro's:


The "information overload" theme (in the evolution of social modes of
communication), is really intriguing. >Should we take it, say, in its
prima facie? I am inclined to put it into question, at least to  
have an

excuse and try to >unturn that pretty stone...


The question of information overload connects with my theory of  
senescence

(1993 book Development & Evolution: Complexity and Change in Biology),
which is that it results from the continual taking in of  
information after
a system has become definitive (all material systems necessarily  
continue
to get marked), while at the same time there is less loss of  
information

(matter is 'sticky').  The result is that a system becomes slower to
respond to perturbations, at least because lag times increase as a  
result
of a multiplication of channels (increased informational friction),  
and

because some channels effect responses at cross purposes.


and then
Walter's:

(2) In the same way: how can arise the 'meaning' In naturalist  
terms or

imposed by us?


I have recently concluded that meaning can be naturalized by  
aligning it

with finality in the Aristotelian causal analysis (material/formal,
efficient/final).  I assert that final cause has not really been  
eliminated

from physics and other sciences, but has been embodied in variational
principles like the Second Law of thermodynamics, and in evolutionary
theory - in Fisher's fundamental theorem of natural selection, as  
well as

in some interpretations of the 'collapse of the wave function' in QM.

STAN


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis




Re: FW: SV: [Fis] info & meaning

2007-10-06 Thread Stanley N. Salthe
Reacting to Christophe's statement:

>But I'm afraid I disagree with your point regarding first person
>consciousness as not representing anything real, >as just being a
>bio-cultural artefact as you say. I take human consciousness as being a
>reality resulting from an >evolution of representations. But this is not
>our today subject.

S: This was actually the subject of a past fis discussion, on 'Internalism'.
For those who may not recall it, internalism is an emerging perspective in
science (e.g. endophysics) and was preceded by viewpoints like the famous
'autopoiesis'.  Basically it is the attempt to model a system as if from
the inside, using first person, present progressive modes of description.
Cosmology ought to be internalist, but cosmologists have contrived to talk
about the universe they are viewing from within AS IF they were seeing it
from outside.

Later Christophe said:

>With this background, we can consider that a meaningful information (a
>meaning) does not exist per se but comes from a system submitted to a
>constraint that has generated the meaning in order to satisfy the
>constraint (stay alive for an organism, valorize ego for a human ...). A
>meaning can be defined only when a system submitted to a constraint is in
>relation with its environment.
 S: This invokes Peircean semiotics, which is triadic, instead of the
erstwhile dyadic discourse of science.  That is, interpretation is inserted
between input and output, and it is a 'system of interpretance' that
accomplishes this, which Loet refers to in
>The expression of Bateson "A difference which makes a difference"
>presumes that there is a system or a series of events for which the
>differences can make a difference.

Then Søren says:
>First person meaningful consciousness is
>a bio-cultural artifact useful for the construction of life and culture, but
>it is not an image of anything real.
  S: This is interesting in regard to the above.  It is basically a
statement denying the possibility of 'pansemiosis' -- that is, that a
semiotic reworking of all of science may be a possibility, or, that all of
nature is characterized by semiosis.  I would take 'First person meaningful
consciousness' to be a highly evolved phenomenon based on a more primitive
semiosis in nature.  Thus: {vague proto-consciousness {First person
meaningful consciousness}} shows the evolutionary relationship.

STAN







___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: SV: [Fis] info & meaning

2007-10-06 Thread Robin Faichney




Friday, October 5, 2007, 1:53:51 PM, Loet wrote:





> Dear colleagues,
>
> I agree with a lot of Christophe Menant's last mail, but I think that I can take it a step further.
>
> The expression of Bateson "A difference which makes a difference" presumes that there is a system or a series of events for which the differences can make a difference. This system selects upon the differences (or Shannon-type information) in the environment of the system. The Shannon-type information is meaningless, but the specification of the system of reference provides the information with meaning. The Shannon-type information which is deselected is discarded as noise.





That's (at least approximately) what I mean when I say that intentional information is always encoded in physical information. Intentional information is the ordinary concept of information and is meaningful. Physical information is very closely related to Shannon information and has no intrinsic meaning, being mere physical patterns -- on this conceptualisation, which is widely accepted within physics, all physical patterns are treated as Shannon-type information. Intentional or semantic information, on the other hand, requires a context, which plays the part of a decoding key. Thus semantic information, or meaning, is always encoded within physical patterns.




> Meaning is provided to the information from the perspective of hindsight.





I don't think "hindsight" is strictly correct, because it implies a conscious "looking back", whereas the processing of meaning (decoding) often occurs prior to consciousness.




> The meaningful information, however, still follows the arrow of time. Meaning processing within psychological and social systems reinforces the feedback arrow (from the hindsight perspective) to the extent that control tends to move to this next-order level. The system can then become anticipatory because the information which is provided with meaning can be entertained by the system as a model. Perhaps human language is required for making that last step: no longer is only information exchanged, but information is packaged into messages in which the information has a codified meaning.





Modelling is certainly what allows anticipation, but some modelling, at least, does not require language: consider catching a ball that's thrown to you. You model the trajectory, I would suggest, in order to put your hand in the right place at the right time, but language is obviously not involved there. Of course you might say that meaning plays no part in that scenario, but I think it's a very big mistake to deny a continuum from significance of any sort at one extreme to the highly abstract and sophisticated meanings of the messages on this list, at the other. What both extremes have in common is the concept of use, as in Wittgenstein's later view of meaning: it is our use, I would suggest, of physical patterns, that encodes significance and meaning within them, and the modelling of a trajectory has significant similarities with the modelling of correspondents and their intentions (though significant differences too, of course).

-- 
Robin Faichney




___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-06 Thread bob logan
Karl et al - I agree there is lots of value in past FIS discussions  
of info but as Shannon himself  opined


"It is hardly to be expected that a single concept of information  
would satisfactorily account for the numerous possible applications  
of this general field."


I only offered one such concept namely instructional or biotic  
information. And I agree with Karl:  Now the

task is to figure out how to manage the consequences.



On 25-Sep-07, at 9:54 AM, karl javorszky wrote:



There is more well-prepared work in FIS than meets the eye. Let us not
restart from inventing the idea of information. This is well done.  
Now the

task is to figure out how to manage the consequences.

All the best:
Karl



___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


RE: SV: [Fis] info & meaning

2007-10-05 Thread Loet Leydesdorff
Dear colleagues,
 
I agree with a lot of Christophe Menant's last mail, but I think that I can
take it a step further. 
 
The expression of Bateson "A difference which makes a difference" presumes
that there is a system or a series of events for which the differences can
make a difference. This system selects upon the differences (or Shannon-type
information) in the environment of the system. The Shannon-type information
is meaningless, but the specification of the system of reference provides
the information with meaning. The Shannon-type information which is
deselected is discarded as noise. 
 
Meaning is provided to the information from the perspective of hindsight.
The meaningful information, however, still follows the arrow of time.
Meaning processing within psychological and social systems reinforces the
feedback arrow (from the hindsight perspective) to the extent that control
tends to move to this next-order level. The system can then become
anticipatory because the information which is provided with meaning can be
entertained by the system as a model. Perhaps, human language is required
for making that last step: no longer is only information exchanged, but
information is packaged into messages in which the information has a
codified meaning.
 
Unfortunately, this email system does not allow one to draw a picture. Maturana
in his 2000 paper about "Nature and the Laws of Nature" nicely wrote about
the orthogonal axis which the next-order system develops upon the stream
from which it originates. (Shannon-type) information processing of
differences would then be the stream. These differences can make a
difference when they pass the interface with the emerging system. If the
latter can develop a feedback mechanism, it can provide meaning to the
information. The "difference that makes a difference" then becomes the
mutual information between the information processing and the
meaning-processing. 
 
It needs further elaboration. 
 
With best wishes, 
 
 
Loet
 
  _  

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR), 
Kloveniersburgwal 48, 1012 CX Amsterdam. 
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
  [EMAIL PROTECTED] ;
 http://www.leydesdorff.net/ 

 
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


TR: SV: [Fis] info & meaning

2007-10-05 Thread Christophe MENANT
Steven, 

In a few words, what I understand by “meaning”. 

1) We all agree that the Shannon theory of information addresses the
capacity of transmission of a communication channel. It does not deal at all
with the possible meaning associated with the information. A different
approach is needed.

2) The notion of meaning associated to information is a complex subject as
it covers a wide area of different applications (focusing on meaning
associated to human language may be misleading as it is one of the most
complex cases). 

First clarification is to define different domains of complexity. Gross
sizing: matter, life, human. 
Then, put aside for a while the case of matter and focus on life and human
in the context of a pragmatic approach.  

With this background, we can consider that a meaningful information (a
meaning) does not exist per se but comes from a system submitted to a
constraint that has generated the meaning in order to satisfy the
constraint. (stay alive for an organism, valorize ego for a human …). A
meaning can be defined only when a system submitted to a constraint is in
relation with its environment.

The environment of the system makes available a lot of information that the
system can receive. Only the received information having a connection with
the constraint of the system will generate a meaning within the system. And
we can consider that the content of the meaning is precisely that connection
existing between the received information and the constraint of the system.
A systemic approach can be formalized on this subject with the introduction
of a Meaning Generator System (MGS); a toy sketch follows point 4 below. See
short paper http://crmenant.free.fr/ResUK/index.HTM

3) As you may have noted, such approach on meaning generation is triadic and
can be part of the neighborhood of the Peircean  theory of sign (in a much
simpler and less elaborated form). 

The MGS is also in the domain of the Von Uexkull biosemiotics where a
meaning is generated by the connection of the constraints of the organism
(Internal world of the organism, Umwelt) with the external world.

4) Going from simple organisms to humans in the field of meaning generation
is not an easy task. The constraints to satisfy cumulate and are more and
more elaborated. The systems also become more complex and are inter-related.
And the mysterious function of human consciousness comes in. 

However, looking at the MGS as a building block can offer some possibilities
(see  http://cogprints.org/4531/ ).
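
A minimal toy sketch, in Python, of the meaning-generation idea described in
point 2 above; the constraint keywords and the incoming signals are
hypothetical illustrations of the idea, not Menant's formal MGS:

# A system holds a constraint, receives information from its environment,
# and generates a meaning only for information connected to that constraint.
def generate_meaning(constraint_keywords, received_information):
    # The "meaning" is the connection between the received information
    # and the system's constraint; None if there is no connection.
    connection = [w for w in received_information.split() if w in constraint_keywords]
    return connection or None

# Toy "stay alive" constraint for a simple organism, rendered as keywords.
constraint = {"food", "predator", "toxin"}

for signal in ["red berries food ahead", "bright sunlight on rocks", "predator scent upwind"]:
    print(signal, "->", generate_meaning(constraint, signal))
# Signals unrelated to the constraint yield no meaning within the system.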

All the best

Christophe

-Original Message-
From: Steven Ericsson-Zenith [mailto:[EMAIL PROTECTED] 
Sent: Friday, 5 October 2007 01:26
To: Christophe Menant
Cc: fis@listas.unizar.es
Subject: Re: SV: [Fis] info & meaning

 

 

I read Pedro's post differently. What definition of meaning are you  

using exactly?

 

I was going to express agreement with Pedro too, but I do not agree  

with either Soren's or Christophe's interpretation of Pedro's posting.  

Can Pedro clarify? And can we be more precise in what we mean when we  

use the term "meaning?"

 

With respect,

Steven

 

 

--

Dr. Steven Ericsson-Zenith

Institute for Advanced Science & Engineering

http://iase.info

http://senses.info

 

 

On Oct 4, 2007, at 3:38 PM, Christophe Menant wrote:

 

> Dear Soren,

> I agree with your reading of Pedro’s  proposal as to start with  

> cellular meaning, and then go thru the higher levels of evolution.  

> It has the advantage of beginning with the simplest case and then  

> look at more complex ones. See (1) for a corresponding approach.

> But I’m afraid I disagree with your point regarding first person  

> consciousness as not representing anything real, as just being a  

> bio-cultural artefact as you say. I take human consciousness as  

> being a reality resulting from an evolution of representations. But  

> this is not our today subject.

> Coming back to it, Walter Riofrio, (New FIS member) has an  

> interesting approach to the notion of meaning where he groups  

> together the emergence of autonomy, function and meaning (2). I  

> understand his work as associating inside a system a meaningful  

> information with a function that needs it in order to use it, in a  

> background of autonomy. Such evolutionary link between meaningful  

> information and function looks as an interesting tool.

> 

> All the best

> Christophe

> (1) - Short paper: http://crmenant.free.fr/ResUK/index.HTM

>   - Full paper: http://www.mdpi.org/entropy/papers/e5020193.pdf

> (2) http://hal.archives-ouvertes.fr/hal-00114521/en/

> 

> 

> 

> > Date: Thu, 4 Oct 2007 22:13:27 +0200

> > From: [EMAIL PROTECTED]

> > Subject: SV: [Fis] info & meaning

> > To: [EMAIL PROTECTED]; fis@listas.unizar.es

> >

> > Dear Pedro

> >

> > Do I understand you right when I see your models a

Re: SV: [Fis] info & meaning

2007-10-04 Thread Steven Ericsson-Zenith


I read Pedro's post differently. What definition of meaning are you  
using exactly?


I was going to express agreement with Pedro too, but I do not agree  
with either Soren's or Christophe's interpretation of Pedro's posting.  
Can Pedro clarify? And can we be more precise in what we mean when we  
use the term "meaning?"


With respect,
Steven


--
Dr. Steven Ericsson-Zenith
Institute for Advanced Science & Engineering
http://iase.info
http://senses.info


On Oct 4, 2007, at 3:38 PM, Christophe Menant wrote:


Dear Soren,
I agree with your reading of Pedro’s  proposal as to start with  
cellular meaning, and then go thru the higher levels of evolution.  
It has the advantage of beginning with the simplest case and then  
look at more complex ones. See (1) for a corresponding approach.
But I’m afraid I disagree with your point regarding first person  
consciousness as not representing anything real, as just being a  
bio-cultural artefact as you say. I take human consciousness as  
being a reality resulting from an evolution of representations. But  
this is not our today subject.
Coming back to it, Walter Riofrio, (New FIS member) has an  
interesting approach to the notion of meaning where he groups  
together the emergence of autonomy, function and meaning (2). I  
understand his work as associating inside a system a meaningful  
information with a function that needs it in order to use it, in a  
background of autonomy. Such evolutionary link between meaningful  
information and function looks as an interesting tool.


All the best
Christophe
(1) - Short paper: http://crmenant.free.fr/ResUK/index.HTM
  - Full paper: http://www.mdpi.org/entropy/papers/e5020193.pdf
(2) http://hal.archives-ouvertes.fr/hal-00114521/en/



> Date: Thu, 4 Oct 2007 22:13:27 +0200
> From: [EMAIL PROTECTED]
> Subject: SV: [Fis] info & meaning
> To: [EMAIL PROTECTED]; fis@listas.unizar.es
>
> Dear Pedro
>
> Do I understand you right when I see your models as:
>
> 1. There is no meaning in inanimate nature.
> 2. Meaning is constructed on a first level by life in the form of  
single

> cell life forms.
> 3. Second level is (chemical) communication between cells.
> 4. Third level is multicellular organisms as species with a gene  
pool.

> 5. Fifth level is their communication.
> 6. Sixth level human construction of meaning in 'life worlds'.
>
> But there is no object of meaning in itself. Energy and mathematical
> information are the basic reality. First person meaningful  
consciousness is
> a bio-cultural artifact useful for the construction of life and  
culture, but

> it is not an image of anything real.
>
> Best wishes
>
> Søren
>
>
> -Original Message-
> From: [EMAIL PROTECTED] [mailto:fis-[EMAIL PROTECTED]] On behalf
> of Pedro Marijuan
> Sent: 4 October 2007 14:23
> To: fis@listas.unizar.es
> Subject: Re: [Fis] info & meaning
>
> Dear colleagues,
>
> What if meaning is equivalent to "zero"?
>
> I mean, if we backtrack to the origins of zero, we find those  
obscure
> philosophers related to Buddhism in India, many centuries ago  
(Brahmagupta,
> 600 ad). It was something difficult to grasp, rather bizarre, the  
fruit of
> quite a long and winding thought, and frankly not of much  
practicity. Then
> after not many developments during a few centuries, another  
scholar in
> central Asia (al-Kwarismi) took the idea and was able to  
algorithmize the
> basic arithmetic operations. Mathematics could fly... and  
nowadays any

> school children learns and uses arithmetics & algebra so easily.
>
> The idea is that if we strictly identify (we "zero" on) meaning as a
> biological construct, work it rigorously for the living cell as a  
tough

> problem of systems biology (and not as a flamboyant autopoiectic or
> autogenic or selftranscence doctrines of Brahmaguptian style),  
then we work
> for a parallel enactive action/perception approach in  
neuroscience, and

> besides pen a rigorous view in social-economic setting under similar
> guidelines --and also find the commonalities with quantum  
computing and

> information physics... finally information science will fly.
>
> Otherwise, if we remain working towards the other direction, the
> undergrounds of zero downwards, we will get confined into bizarre,
> voluminous, useless discussions & doctrines on information.  
Cellular meaning

> is our zero concept: we should go for it.
>
> best
>
> Pedro
>
>
>
>
>
> ___
> fis mailing list
> fis@listas.unizar.es
> http://webmail.unizar.es/mailman/listinfo/fis
>
>

FW: SV: [Fis] info & meaning

2007-10-04 Thread Christophe Menant

Dear Soren,

I agree with your reading of Pedro's proposal as to start with cellular
meaning, and then go thru the higher levels of evolution. It has the
advantage of beginning with the simplest case and then look at more complex
ones. See (1) for a corresponding approach.

But I'm afraid I disagree with your point regarding first person
consciousness as not representing anything real, as just being a
bio-cultural artefact as you say. I take human consciousness as being a
reality resulting from an evolution of representations. But this is not
our today subject.

Coming back to it, Walter Riofrio, (New FIS member) has an interesting
approach to the notion of meaning where he groups together the emergence of
autonomy, function and meaning (2). I understand his work as associating
inside a system a meaningful information with a function that needs it in
order to use it, in a background of autonomy. Such evolutionary link between
meaningful information and function looks as an interesting tool.

All the best

Christophe

(1) - Short paper: http://crmenant.free.fr/ResUK/index.HTM
    - Full paper: http://www.mdpi.org/entropy/papers/e5020193.pdf
(2) http://hal.archives-ouvertes.fr/hal-00114521/en/
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


SV: [Fis] info & meaning

2007-10-04 Thread Søren Brier
Dear Pedro

Do I understand you right when I see your models as:

1. There is no meaning in inanimate nature.
2. Meaning is constructed on a first level by life in the form of single
cell life forms.
3. Second level is (chemical) communication between cells.
4. Third level is multicellular organisms as species with a gene pool.
5. Fifth level is their communication.
6. Sixth level human construction of meaning in 'life worlds'. 

But there is no object of meaning in itself. Energy and mathematical
information are the basic reality. First person meaningful consciousness is
a bio-cultural artifact useful for the construction of life and culture, but
it is not an image of anything real.

Best wishes

Søren


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On behalf
of Pedro Marijuan
Sent: 4 October 2007 14:23
To: fis@listas.unizar.es
Subject: Re: [Fis] info & meaning

Dear colleagues,

What if meaning is equivalent to "zero"?

I mean, if we backtrack to the origins of zero, we find those obscure
philosophers related to Buddhism in India, many centuries ago (Brahmagupta,
600 ad). It was something difficult to grasp, rather bizarre, the fruit of
quite a long and winding thought, and frankly not of much practicity. Then
after not many developments during a few centuries, another scholar in
central Asia (al-Kwarismi) took the idea and was able to algorithmize the
basic arithmetic operations. Mathematics could fly... and nowadays any
school children learns and uses arithmetics & algebra so easily.

The idea is that if we strictly identify (we "zero" on) meaning as a
biological construct, work it rigorously for the living cell as a tough
problem of systems biology (and not as a flamboyant autopoiectic or
autogenic or selftranscence doctrines of Brahmaguptian style), then we work
for a parallel enactive action/perception approach in neuroscience, and
besides pen a rigorous view in social-economic setting under similar
guidelines --and also find the commonalities with quantum computing and
information physics...  finally information science will fly.

Otherwise, if we remain working towards the other direction, the
undergrounds of zero downwards, we will get confined into bizarre,
voluminous, useless discussions & doctrines on information. Cellular meaning
is our zero concept: we should go for it.

best

Pedro



   

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


 


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-10-04 Thread Pedro Marijuan

Dear colleagues,

What if meaning is equivalent to "zero"?

I mean, if we backtrack to the origins of zero, we find those obscure 
philosophers related to Buddhism in India, many centuries ago (Brahmagupta, 
600 AD). It was something difficult to grasp, rather bizarre, the fruit of 
quite a long and winding thought, and frankly not of much practical use. Then, 
after not many developments during a few centuries, another scholar in 
central Asia (al-Kwarismi) took the idea and was able to algorithmize the 
basic arithmetic operations. Mathematics could fly... and nowadays any 
school child learns and uses arithmetic & algebra so easily.


The idea is that if we strictly identify (we "zero" on) meaning as a 
biological construct, work it rigorously for the living cell as a tough 
problem of systems biology (and not via flamboyant autopoietic or 
autogenic or self-transcendence doctrines of Brahmaguptian style), then we work 
for a parallel enactive action/perception approach in neuroscience, and 
besides pen a rigorous view in social-economic setting under similar 
guidelines --and also find the commonalities with quantum computing and 
information physics...  finally information science will fly.


Otherwise, if we remain working towards the other direction, the 
undergrounds of zero downwards, we will get confined into bizarre, 
voluminous, useless discussions & doctrines on information. Cellular 
meaning is our zero concept: we should go for it.


best

Pedro



  


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-09-30 Thread Stanley N. Salthe
I comment on Walter's remark concerning my posting
>Dear FIS colleagues,
-snip-
>
>I can't see the appropriateness of reinserting the teleological language.
>Moreover, it seems to me that with these reinsertions we could lose, to a
>certain degree, the rational and scientific enterprise.
>
>Also, it indirectly gives material to the "intelligent-design" people.

S: Since final causes have been present in variational principles for a
long time without destroying the scientific enterprise, or its usefulness
in technological support, I cannot see the point of this comment against
finality.

As for the connection to ID, this statement of Walter's presupposes that
finality can only be associated with intentionality.  This has long been
known to be untrue.  To wit: purpose has been generalized to function in
biology, and has been still further generalized to physical propensity
in the physico-chemical realm.  Thus, we have {physical propensity {
function {purpose}}}, or {teleomaty {teleonomy {teleology}}}.

As well, concerning Loet's
>I would go there with Maturana, for example, on the point that living
>systems communicate in terms of molecules, while molecules, for example,
>communicate in terms of atoms (in a chemical evolution), etc.  There can
>be a lot of information which is meaningless (e.g., noise) and in a very
>meaningful configuration there may be little communication of information
>ongoing.

 S: This relates to the Scale Hierarchy, e.g., [organism [cell
[macromolecule]]], as explored in detail in my 1985 book, Evolving
Hierarchical Systems, and some subsequent texts.  As I mention in my
comments on Bob's text, information at any scale is meaningless, AS SUCH,
at different scales, but may be detected indirectly inasmuch as these
meanings at other scales were involved in generating boundary conditions
from higher scalar levels and initiating conditions from lower scalar
levels upon the scale in focus.

STAN




>Sincerely,
>
>
>Walter
>
>
>
>
>
>___
>fis mailing list
>fis@listas.unizar.es
>http://webmail.unizar.es/mailman/listinfo/fis




___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


RE: [Fis] info & meaning

2007-09-30 Thread Loet Leydesdorff
wledge Base in Inter-human Communication Systems,
Canadian Journal of Communication 28(3), 267-289 (2003); <
<http://www.leydesdorff.net/incursion/incursion.pdf> pdf version>. My
conclusion is that there is still a long way to go in this research program
and that unlike yours it is not confined to the biological domain because of
the more abstract definitions. 
 
With best wishes, 
 
 
Loet
  _  

Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
 <mailto:[EMAIL PROTECTED]> [EMAIL PROTECTED] ;
<http://www.leydesdorff.net/> http://www.leydesdorff.net/ 

 
Now available:
<http://www.universal-publishers.com/book.php?method=ISBN&book=1581129378>
The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$
18.95 
 <http://www.universal-publishers.com/book.php?method=ISBN&book=1581126956>
The Self-Organization of the Knowledge-Based Society;
<http://www.universal-publishers.com/book.php?method=ISBN&book=1581126816>
The Challenge of Scientometrics

 
 


  _  

From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of bob logan (by way of Pedro Marijuan<[EMAIL PROTECTED]>)
Sent: Friday, September 21, 2007 2:50 PM
To: fis@listas.unizar.es
Subject: [Fis] info & meaning


12th FIS Discussion Session: ON INFORMATION AND MEANING



"The Relativity of Information and Its Relationship to Materiality, Meaning
and Organization" 

 

Robert K. Logan
Department of Physics
University of Toronto 




This is an executive summary of the attached paper. It is too short to do
justice to the subject, so the FIS reader is encouraged to read the whole
article. A number of interesting questions will be raised in this article
which examines the nature of information and its relationship to meaning,
organization and materiality. The following questions will be addressed.
 
Is there only one form of information or are there several kinds of
information? In other words is information an invariant or a universal
independent of its frame of reference or context? 
 
What is the relationship of information to meaning and organization? 
 
Is information a thing like a noun or a process like a verb? 
 
Is information material or is it a form of energy or is it just a pattern? 
 
Is information a uniquely human phenomenon or do non-human forms of life
contain information? 
 
The origin of the term information is examined in the full paper. Shannon's
definition of information as entropy using the formula H = - Σ p_i log p_i is
presented. It is noted that Wiener used the same definition with the
opposite sign. Shannon information has nothing to do with meaning and is
only concerned with how accurately a string of symbols is transmitted from
point A to B. MacKay critiqued the Shannon definition of information and
argued that he did not see "too close a connection between the notion of
information as we use it in communications engineering and the determination
of the semantic question of what to send and to whom to send it." He
suggested that information should be defined as "the change in a receiver's
mind-set, and thus with meaning" and not just the sender's signal (Hayles
1999b, p. 74). His version of information unfortunately did not survive
because it was deemed by the reductionists as too subjective. Bateson's (1973)
famous definition of information as "the difference that makes a difference"
which actually is derived from MacKay's earlier assertion that "information
is a distinction that makes a difference" introduces the importance of
meaning in understanding information. 
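 
A minimal sketch, in Python, of the Shannon measure; the two distributions
are hypothetical and simply illustrate that a uniform ("random soup")
distribution maximizes H while a highly organized, peaked distribution has
low H -- the point taken up in the relativity argument below:

import math

def shannon_entropy(probabilities):
    # H = - sum_i p_i * log2(p_i), in bits.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical distributions over four symbols (or four chemical species).
random_soup = [0.25, 0.25, 0.25, 0.25]   # maximally uncertain
organized   = [0.97, 0.01, 0.01, 0.01]   # highly constrained / organized

print(shannon_entropy(random_soup))   # 2.0 bits
print(shannon_entropy(organized))     # about 0.24 bits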
 

Information in Biotic Systems

We next review the work of Propagating Organization: An Enquiry (POE)
(Kauffman, Logan et al. 2007) in which it is shown that Shannon information
cannot describe a biotic system. The core argument of POE was that Shannon
information "does not apply to the evolution of the biosphere" because
Darwinian preadaptations cannot be predicted and as a consequence "the
ensemble of possibilities and their entropy cannot be calculated. Instead of
Shannon information we defined a new form of information, which we called
instructional or biotic information, not with Shannon, but with constraints
or boundary conditions. The amount of information will be related to the
diversity of constraints and the diversity of processes that they can
partially cause to occur.
 

The Relativity of Information
 
In POE we associated biotic or instructional information with the
organization that a biotic agent is able to propagate. This contradicts
Shannon's definition of information and the notion that a random set or soup
of organic chemicals has more Shannon information than a structured and
organized set of organic chemicals found in a living organism. This argument
completely contradicts the notion

Re: [Fis] info & meaning

2007-09-28 Thread Pedro Marijuan

-- Forwarded message --
From: karl javorszky <[EMAIL PROTECTED]>
Date: 25.09.2007 15:54
Subject: Re: [Fis] info & meaning
To: bob logan <[EMAIL PROTECTED]>, Pedro Marijuan <[EMAIL PROTECTED]>,
fis@listas.unizar.es



Dear Bob and FIS,

let us further clarify the object of these discussions:
+ we discuss the concept of information;
+ we discuss how the idea of information shows itself in sociology, 
biology, medicine, etc.;
+ we discuss how information is understood in a relational context within 
its context;
+ we discuss how information is different in one set of relations to an 
information contained in a different set of relations.


We do NOT discuss here:
+ the concept of information is beyond human understanding;
+ we have no idea what information is;
+ there is no numeric-logical definition of information;
+ we don't want to confront information as a numeric task.

The whole point of FIS was so far to reach an agreement on:
* there is a well-circumscribed concept of information,
* information is that what we neglect as we conduct an addition,
* using that what we neglect as we conduct an addition as a logical 
ordering principle we can very well calculate and relate information,
* using that what we neglect while we conduct an addition as a logical 
ordering principle a multitude of logical truths, numeric constants and 
mathematical functions appear,
* the collection of logical constants which appear as we use as an ordering 
principle that what we usually neglect while conducting an addition 
resembles very much that what is called the periodic system of elements,
* quite many of the functions and constants which become apparent as one 
orders and reorders the numbering system according to that what we neglect 
as we conduct an addition resemble very much such functions and constants 
and procedures as known from books on Physics, Chemistry and Genetics.


It is funny that the task itself got out of focus in this discussion. We 
should not lament about the lack of a consistent function to measure 
information with but start using seriously that what has been evolved and 
presented within FIS.


To repeat:
Information is that what we neglect as we conduct an addition.

In the more detailed definition:
1. Information is the relation of symbols to their meaning on both sides of 
an equivalence sign.

Expl.1:
3+4=5+2
contains the following amount of information.
step 1: establish number of symbols
there are 4 symbols in this equivalence.
step 2: establish meaning of symbols
there is (3+4)=7 on the left side and (5+2)=7 on the right side, therefore 
the meaning of this equivalence is 2*7 that is 14.

step 3: divide meaning by number
the meaning of 14 is divided by the number of symbols, that is 4. The 
result is 14/4 = 7/2, that is 3.5.

The information content in "3+4=5+2" is 3.5 (a short computational sketch 
of this calculation follows the definitions below).

2. The tautology's information is the tautology itself
3=3 has the info content of 6/2=3

3. The unit-type counting understates the information content of an equivalence
1+1+1+1+1=5 has the info content of 10/6. Generally, n=1+1+1+1+...+1 has 
the information content of 2n/(n+1).


4. Information is both a scalar and an indexed value
The information  S1=3 (symbol 1 has the value 3) is a fixed scalar in the 
unit-type counting. It is an indexed value in the relational counting, 
because the information lies in (and depends on) many other properties:
+ in which unit-type result equivalence is S1 included (what is the 2n 
being stated on both sides);

+ in which side is S1 included (3+3+3+3=1+1+1+1+1+1+1+1+1+3);
+ how many symbols;
+ how different the symbols;
+ which smallest symbol;
+ level of truth
and so forth.

5. Information density is material density
That what we consider "material" is certainly the truest form of logical 
existence. Relative to this extent of truth, there exist lesser true forms 
of logical existence. "Material" is both true and certain to a specific 
degree, "space" is true and certain to a specific pair of degrees, "forces" 
are differently true and differently certain to those describing a "field".
There exists a subset among the subsets of the set of true sentences which 
is certain and true to the maximal extent.
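
A minimal Python sketch of the calculation in Expl.1 above, following the
informal rule as stated (count the symbols on both sides, take the stated
value of each side as the "meaning", divide meaning by the number of
symbols); the function name is my own illustration, not part of the
original proposal:

def information_content(left, right):
    # left and right are the lists of summands on each side of the '=' sign.
    assert sum(left) == sum(right), "not an equivalence"
    symbols = len(left) + len(right)      # step 1: number of symbols
    meaning = sum(left) + sum(right)      # step 2: meaning of both sides
    return meaning / symbols              # step 3: meaning per symbol

print(information_content([3, 4], [5, 2]))        # 14/4 = 3.5
print(information_content([3], [3]))              # tautology: 6/2 = 3
print(information_content([1, 1, 1, 1, 1], [5]))  # unit-type counting: 10/6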


Let us build on these common understandings. The definitions presented here 
agree to the dot with the predictions of Shannon. This is a probability 
function free of dimensions and it can be used to predict (actually, the 
size of the average symbol, that is, the average size of matter as a 
mathematical entity) those structural constants which we use in our 
perception and cognition as background, which evolve as the results of the 
interaction between the traditional interpretation of the term 
"mathematical" in this sentenc

Re: [Fis] info & meaning

2007-09-27 Thread walter\.riofrio
Dear FIS colleagues, 

To propose the possible situations for the emergence of new properties (like 
information and functions), where meaning is nothing but a condition on 
simple physical events such that the 'edge of chaos' condition is not 
destroyed, is not incompatible with the current uses of information, I 
think. 

I can't see the appropriateness of reinserting the teleological language. 
Moreover, it seems to me that with these reinsertions we could lose, to a 
certain degree, the rational and scientific enterprise. 

Also, it indirectly gives material to the "intelligent-design" people. 


Sincerely,


Walter





___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


[Fis] Info & meaning

2007-09-25 Thread Stanley N. Salthe
First I comment on Pedro's:

>The "information overload" theme (in the evolution of social modes of
>communication), is really intriguing. >Should we take it, say, in its
>prima facie? I am inclined to put it into question, at least to have an
>excuse and try to >unturn that pretty stone...

The question of information overload connects with my theory of senescence
(1993 book Development & Evolution: Complexity and Change in Biology),
which is that it results from the continual taking in of information after
a system has become definitive (all material systems necessarily continue
to get marked), while at the same time there is less loss of information
(matter is 'sticky').  The result is that a system becomes slower to
respond to perturbations, at least because lag times increase as a result
of a multiplication of channels (increased informational friction), and
because some channels effect responses at cross purposes.


and then
Walter's:

>(2) In the same way: how can arise the 'meaning' In naturalist terms or
>imposed by us?

I have recently concluded that meaning can be naturalized by aligning it
with finality in the Aristotelian causal analysis (material/formal,
efficient/final).  I assert that final cause has not really been eliminated
from physics and other sciences, but has been embodied in variational
principles like the Second Law of thermodynamics, and in evolutionary
theory - in Fisher's fundamental theorem of natural selection, as well as
in some interpretations of the 'collapse of the wave function' in QM.

STAN


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info & meaning

2007-09-25 Thread walter\.riofrio

Dear Bob, Pedro and colleagues:

This is my first message in FIS list.
My name is Walter Riofrio, Associate Professor at the Peruvian University 
Cayetano Heredia in Lima, Peru.  My research interests are:
(1) Theoretical and Conceptual Analysis in Biology and Natural Sciences; 
(2) Nature of Biological Evolution; (3) Emergence of basic properties in 
Living Systems; (4) Relationships between Biological Functions, Biological 
Information and Network of Molecular Processes; (5) Theoretical Studies in 
Architecture of the Brain and its relations with the Emergence of Cognition 
in Evolution; (6) The Neural and Temporal Code linked to the Emergence of 
Mental Properties.


I would like to participate in the discussion of the proposals on the 
“biotic interpretation of information” in POE.
I agree with Pedro's statement that “The origins of life…must be 
explained…by means of the highest power or upper hand in science: the 
experimental.”


But I would add the following: we still need the conceptual development 
(not so easy a task) and its translation into the design of interesting, 
elegant and outstanding experiments, with which we not only continue the 
accumulation of more or less dispersed data but also gain concrete 
assurance that we are obtaining explanations on this topic. 


Actually, I have a lot of questions, because my research is related to 
some issues in the paper (although I only put some of them here): 


(1) To understand adequately one of the main claims, namely equating 
information (biotic or instructional information) with constraints: I 
wonder whether the notion of “propagating organization” adds something 
more to the notion of “reproduction”. 


(2) In the same way: how does “meaning” arise?  In naturalist terms, or 
imposed by us?  If the former, how do living beings process the “meaning 
of ‘some’ information” in themselves? 


One can therefore argue that, since the meaning of instructional 
information is propagating organization, we finally understand the meaning 
of life – the “meaning of life” is propagating organization. 


(3) My last question arises because I have worked on this issue: the nature 
of biological information.  My main working thesis is that it is still 
possible to construct, in “Naturalized Normative” ways, notions like 
Information, Function and Autonomy.  Moreover, these three properties are 
the most basic properties of life. 


Briefly stated, I consider autonomy a (global) property which is the 
result of the first emergence of information and function (in the local 
interactions between the processes of an interrelated molecular network).
My working hypothesis concerns the origins of prebiotic phenomena, and I 
postulate that the systems which opened these doors were “Informational 
Dynamic Systems” (IDS). 


These are the systems that produced the necessary and sufficient conditions 
to set off the succession of phenomena in the universe leading towards the 
possibility of the emergence of biological systems.  They might also have 
been the driving force behind the beginning of the move from the inanimate 
world to the animate one. 


My theoretical proposal envisages the emergence of information and 
biological function as a sort of coordinated origin – at the same time – in 
the local processes of these types of systems. 


This view of the physical emergence of Information and Function leads us 
to talk about the notion of “Information-Function”: each time we observe 
functions happening in these systems’ processes, we can be sure that some 
type of information has been transmitted through them.  Likewise, when we 
observe the transmission of some type of information among the processes, 
we can be sure that some function has been produced among them. 


This means that the kind of organization in these systems (and, of course, 
in living beings) is an “Informational and Functional Dynamic Organization”. 


If anyone would like to read more about my proposals, here are three links:

- One chapter in a book:
(A) Riofrio, Walter (2007a) Informational Dynamic Systems: Autonomy, 
Information, Function. In Worldviews, Science, and Us: Philosophy and 
Complexity, edited by C. Gershenson, D. Aerts, and B. Edmonds. World 
Scientific, Singapore, pp. 232-249.
Link to a pre-publication version:
http://wriofrio.aboutus.gs/Papers/PyC-WR.pdf 



- Two works which develop some consequences of my main claims (in 
the aforementioned paper), applied to the relationships between the 
cytoarchitecture of the brain and the emergence of cognition in evolution:


(B) Riofrio, W. and Aguilar, L.A. (2006) Different Neurons Population 
Distribution correlates with Topologic-Temporal Dynamic Acoustic 
Information Flow. InterJournal Complex Systems  # [1619].


http://wriofrio.aboutus.gs/Papers/AcInFl-WR.pdf

(C) R

Re: [Fis] info & meaning

2007-09-24 Thread Pedro Marijuan

Dear Bob and colleagues,

Thanks for the scholarly work. It was nice reading the vast landscape, 
historical and otherwise, you have covered around that new biotic 
interpretation of information as "Propagating Organization" or POE. Perhaps 
in this list we have already arrived at a truce on Shannonian matters 
--notwithstanding a general agreement with your criticisms in that respect, 
the problem may be a lack of interesting alternative generalizations, 
inclusive enough so that the Shannonian theory gets its proper place as a 
great theory of communication, as a great theory of physical structures 
(in the guise of "physical information theory"), and also as a great theory 
in data analysis.


Thus I am sympathetic with integrative quests such as POE, but given that 
disagreements are usually the most valuable stuff for discussions, I would 
raise the following points:


 1. Constraints and boundary conditions are conflated (see below), with 
somewhat more emphasis on the former. Thus, taking into account that most 
chemical constraints are in the form of "activation energies" and "free 
energies", which establish the kinetics of the multitude of chemical 
reactions and molecular transformations in the cell, how can this 
heterogeneous collection of variables and parameters receive the kind of 
global treatment that POE seems to imply? I have seen no hints in the 
paper. (A toy sketch of what such a treatment might look like follows the 
quoted passage below.)


 ...we defined a new form of information, which we called instructional 
or biotic information, not with Shannon, but with constraints or boundary 
conditions. The amount of information will be related to the diversity of 
constraints and the diversity of processes that they can partially cause 
to occur.
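
Just to make the question concrete (my toy illustration, not anything 
proposed in POE; all values are hypothetical): one naive "global treatment" 
would let each activation energy fix an Arrhenius rate, normalise those 
rates into a distribution over processes, and read off a Shannon-type 
diversity of the constraint-governed processes.

    import math

    R = 8.314      # gas constant, J/(mol K)
    T = 310.0      # roughly physiological temperature, K
    A = 1.0e13     # a single assumed pre-exponential factor, 1/s

    # Hypothetical activation energies (J/mol), each acting as a kinetic
    # constraint on one reaction/process in the network.
    activation_energies = [50e3, 55e3, 60e3, 70e3, 80e3]

    # Arrhenius: k = A * exp(-Ea / (R*T))
    rates = [A * math.exp(-Ea / (R * T)) for Ea in activation_energies]

    # Normalise the rates into a distribution over processes and take its
    # Shannon diversity (bits) as one crude stand-in for "diversity of
    # constraints and the processes they can cause to occur".
    total = sum(rates)
    p = [k / total for k in rates]
    diversity = -sum(x * math.log2(x) for x in p if x > 0)
    print(f"process diversity: {diversity:.2f} bits (max {math.log2(len(p)):.2f})")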



2. The statement below is ambiguous. E. coli, for instance, may contain on 
the order of 1 or 2 million molecules (other than water) of 5,000 to 
10,000 different classes. The amount of Shannon info is far, far higher in 
the living soup than in any inorganic soup (a back-of-the-envelope check 
follows the quoted passage below).


...This contradicts Shannon’s definition of information and the notion 
that a random set or soup of organic chemicals has more Shannon 
information than a structured and organized set of organic chemicals found 
in a living organism.
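
A back-of-the-envelope check on those figures (my arithmetic, not from the 
post or the paper): treating the N molecular classes as equally likely gives 
the per-molecule upper bound

    H_{\max} = \log_2 N \approx \log_2 10^4 \approx 13.3 \ \text{bits per molecule}

so roughly 2 x 10^6 non-water molecules carry at most about 2.7 x 10^7 bits, 
whereas a simple inorganic soup with, say, fifty species is capped at about 
log2(50) = 5.6 bits per molecule. The exact numbers depend on the real 
abundance distribution, but the ordering Pedro points to holds.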



3. The origins of life (implicit in the text below) must be explained not 
just by means of some more or less interesting formal approach, but by 
means of the highest power or upper hand in science: the experimental. I 
remember during the late 80s and early 90s how different approaches in 
artificial life were claiming to "explain away" the "logical" part of 
bio-matter (Langton, conspicuously) and to be able to put it into the 
computer cavalierly...


 Kauffman (2000) has described how this organization emerges through 
autocatalysis as an emergent phenomenon with properties that cannot be 
derived from, predicted from or reduced to the properties of the 
biomolecules of which the living organism is composed and hence provides 
an explanation of where biotic information comes from.





4. Going back to MacKay ("distinction that makes a difference") to 
readdress the Shannonian overextension looks like a very nice note to me. 
Independently, I had posted on this list a few years ago an approach to 
info as "distinction on the adjacent". The term "distinction" followed 
some previous work on the logics of multidimensional partitions as 
discussed by Karl (also on this list).


5. The info analysis of life might demand a few other info categories. 
Three info genera were discussed years ago by myself and other 
parties---structural, generative, communicational. It would be too long a 
discussion; the point may be that bioinformational approaches are a very 
promising avenue towards more integrated approaches to the info 
phenomenon. However, another exciting avenue is information physics / 
quantum information. Without discarding breakthroughs in other fields, 
these two branches may provide the basics of a new info perspective, say.


6. In the approach to cultures and societies (in the paper), I think we 
have to recognize a black hole in the territories of the neurosciences. We 
may call it "human nature", "theory of mind", "central theory of the 
neurosciences" or whatever. But without filling that void, it is very 
probable that the info synthesis mentioned above could not occur.


Anyhow, we also have the "info overload" theme from a few weeks ago. Quite a bit!

best regards

Pedro

=
Pedro C. Marijuán
Cátedra SAMCA
Institute of Engineering Research of Aragon (I3A)
Maria de Luna, 3. CPS, Univ. of Zaragoza
50018 Zaragoza, Spain
TEL. (34) 976 762761 and 762707, FAX (34) 976 762043
email: [EMAIL PROTECTED]
=