Re: [Fis] info meaning

2007-10-12 Thread bob logan
Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. Therefore how can entropy and Shannon
entropy be compared, let alone connected?


I am talking about information not entropy - an organized collection  
of organic chemicals must have more meaningful info than an  
unorganized collection of the same chemicals.



On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a random
soup of organic chemicals has more Shannon info than an equal number of
organic chemicals organized as a living cell, where knowledge of some
chemicals automatically implies the presence of others and hence they
carry less surprise than those of the random soup of organic chemicals? - Bob


Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the system is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup, the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added, the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system increases
and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.
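
A minimal numerical sketch of this point, with invented numbers (eight
equiprobable chemicals and a two-category grouping variable that is 90%
correlated with the chemical; Python used only for illustration):

    import math

    def shannon_entropy(probs):
        """H = -sum(p * log2(p)), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    N = 8                      # number of chemical species
    soup = [1.0 / N] * N       # random soup: every chemical equally likely
    H_soup, Hmax_soup = shannon_entropy(soup), math.log2(N)
    print("soup:      H = %.2f, H_max = %.2f, redundancy = %.2f"
          % (H_soup, Hmax_soup, 1 - H_soup / Hmax_soup))

    # Add a grouping variable with M = 2 categories (say, membrane-bound
    # vs. free). The maximum entropy rises to log2(N * M). Because the
    # group is strongly correlated with the chemical (an invented 90/10
    # split), the realized joint entropy stays below the new maximum and
    # the redundancy increases.
    M = 2
    organized = []
    for _ in range(N):
        organized += [0.9 / N, 0.1 / N]   # P(chemical, home group), P(chemical, other group)
    H_org, Hmax_org = shannon_entropy(organized), math.log2(N * M)
    print("organized: H = %.2f, H_max = %.2f, redundancy = %.2f"
          % (H_org, Hmax_org, 1 - H_org / Hmax_org))

The soup sits at its maximum entropy (redundancy 0), while the grouped,
organized system falls below its enlarged maximum of log(N * M), so its
redundancy is positive.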

I hope that this is convincing or provokes your next reaction.

Best wishes,


Loet


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info meaning

2007-10-12 Thread Guy A Hoelzer
Bob,

If the notions of Entropy and Shannon Information are alternative approaches
to characterize the same phenomenon in Nature, then the ways they have been
modeled would not necessarily reveal an underlying and fundamental
commonality.  I think that many of us suspect that this is the case and we
are striving to understand and articulate (model) the fundamental
commonality.  We may be wrong, of course, but your argument doesn't dissuade
me.  In fact, I must admit sheepishly that I'm not sure how one would go
about analyzing the relationship between these ideas in a way that could
dissuade me.  If such an analysis is not possible, then I suspect that the
question of a fundamental commonality will have to die slowly as progress
fails to occur.

Regards,

Guy


on 10/12/07 12:01 PM, bob logan at [EMAIL PROTECTED] wrote:

 Loet et al - I guess I am not convinced that information and entropy
 are connected. Entropy in physics has the dimension of energy divided
 by temperature. Shannon entropy has no physical dimension - it is
 missing the Boltzmann constant. Therefore how can entropy and Shannon
 entropy be compared, let alone connected?
 
 I am talking about information not entropy - an organized collection
 of organic chemicals must have more meaningful info than an
 unorganized collection of the same chemicals.
 
 
 On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:
 
 Loet - if your claim is true then how do you explain that a random
 soup of organic chemicals has more Shannon info than an equal number of
 organic chemicals organized as a living cell, where knowledge of some
 chemicals automatically implies the presence of others and hence they
 carry less surprise than those of the random soup of organic chemicals? - Bob
 
 Dear Bob and colleagues,
 
 In the case of the random soup of organic chemicals, the maximum
 entropy of the system is set by the number of chemicals involved (N).
 The maximum entropy is therefore log(N). (Because of the randomness of
 the soup, the Shannon entropy will not be much lower.)
 
 If a grouping variable with M categories is added, the maximum entropy
 is log(N * M). Ceteris paribus, the redundancy in the system increases
 and the Shannon entropy can be expected to decrease.
 
 In class, I sometimes use the example of comparing Calcutta with New
 York in terms of sustainability. Both have a similar number of
 inhabitants, but the organization of New York is more complex to the
 extent that the value of the grouping variables (the systems of
 communication) becomes more important than the grouped variable (N).
 When M is extended to M+1, N possibilities are added.
 
 I hope that this is convincing or provokes your next reaction.
 
 Best wishes,
 
 
 Loet
 

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info meaning

2007-10-12 Thread John Collier

At 09:01 PM 12/10/2007, bob logan wrote:

Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. Therefore how can entropy and Shannon
entropy be compared, let alone connected?


Bob,

Temperature has the dimensions of energy per degree of freedom.
Do the dimensional analysis, and you end up with a measure in
degrees of freedom. This is a very reasonable dimensionality
for information.
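
One standard way to spell the dimensional bookkeeping out, assuming the
Boltzmann relation and the equipartition reading of temperature (a textbook
gloss, offered only as a sketch):

    S = k_B \ln W, \quad [S] = \mathrm{J/K}, \quad [k_B] = \mathrm{J/K}
      \;\Rightarrow\; S / k_B = \ln W \quad \text{(dimensionless, counted in nats)}

    \langle E \rangle = \tfrac{1}{2} k_B T \ \text{per quadratic degree of freedom}
      \;\Rightarrow\; [T] \sim \text{energy per degree of freedom}
      \;\Rightarrow\; [S] = \frac{[\text{energy}]}{[\text{energy}]/[\text{degree of freedom}]}
                      = \text{degrees of freedom}

Dividing out k_B (equivalently, reading temperature as energy per degree of
freedom) leaves a pure count, which is what allows the dimensionless Shannon
measure and the thermodynamic one to be put side by side at all.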


I am talking about information not entropy - an organized collection
of organic chemicals must have more meaningful info than an
unorganized collection of the same chemicals.


I am planning to make some general comments on meaning, but
I am too busy right now. They will have to wait for later. There are
some very tricky issues involved, but I will say right now that
information is not meaningful, but has only a potential for meaning.
All information must be interpreted to be meaningful, and the same
information can have very different meanings depending on how it
is interpreted. Information, on the other hand, has an objective
measure independent of interpretation, and that depends on
the measure of asymmetry within a system. See the recent book
by my student Scott Muller for details, Asymmetry: The Foundation
of Information, Springer 2007.

http://www.amazon.ca/Asymmetry-Foundation-Information-Scott-Muller/dp/3540698833
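
As a toy illustration of the general asymmetry-information link (a generic
counting example, not Muller's own formalism): the more internally asymmetric
a configuration is, the more distinguishable rearrangements it has, and so
the more bits are needed to single one of them out.

    import math
    from collections import Counter

    def bits_to_specify_arrangement(s):
        """log2 of the number of distinct rearrangements of the multiset s:
        the information needed to pick out one particular arrangement."""
        arrangements = math.factorial(len(s))
        for count in Counter(s).values():
            arrangements //= math.factorial(count)
        return math.log2(arrangements)

    print(bits_to_specify_arrangement("AAAAAAAA"))  # fully symmetric:   0.0 bits
    print(bits_to_specify_arrangement("AAAABBBB"))  # partly asymmetric: ~6.1 bits
    print(bits_to_specify_arrangement("ABCDEFGH"))  # no repeats:        ~15.3 bits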

This whole discussion on meaning needs far more precision and a
lot of garbage collecting.

Cheers,
John




--
Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] info meaning

2007-10-12 Thread bob logan
hi stan - I think we need to agree to disagree - for me final cause  
involves purpose or intention. A gas that expands into a larger  
volume has no intention or purpose and therefore one cannot talk  
about final cause because there is no agent - I did enjoy your  
argument but I just cannot buy it - Bob



On 12-Oct-07, at 3:01 PM, bob logan wrote:

Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. Therefore how can entropy and Shannon
entropy be compared, let alone connected?


I am talking about information not entropy - an organized  
collection of organic chemicals must have more meaningful info than  
an unorganized collection of the same chemicals.



On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a random
soup of organic chemicals has more Shannon info than an equal number of
organic chemicals organized as a living cell, where knowledge of some
chemicals automatically implies the presence of others and hence they
carry less surprise than those of the random soup of organic chemicals? - Bob


Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the system is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup, the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added, the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system increases
and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provokes your next reaction.

Best wishes,


Loet




___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


[Fis] Info, meaning narrative

2007-10-12 Thread Ted Goranson
Message forwarded from Beth Cardier. I understand it should have
appeared a few days ago but for subscription difficulties.


Her email is [EMAIL PROTECTED]

-Ted
+++

Dear FIS colleagues,

My name is Beth Cardier, and I am currently researching narrative as 
information at Melbourne University. I haven't contributed to this 
list before but I have been watching it since I attended the FIS 
conference in Paris. In response to Ted's call for a hybrid space, I 
thought I'd share some of the ways that the ideas raised in this 
thread overlap with a few key concepts from the arts, as well as some 
of the ways the logics differ. I also happen to know a lot about
newspapers, which have recently entered the discussion.


As a writer and media analyst, I don't deal with the conveyance of 
meaning numerically, but I suspect there could be a geometric way to 
understand narrative. For me, it seems that the first difference 
between artistic and scientific notions of information is that
artistic logic doesn't operate in terms of context and content -
there is no container and object. Perhaps linguistic principles are
responsible for the traditional idea of object and link, noun and 
verb. Instead, storytelling deals in situations, carrying an 
ontology in which participants are elements of context, because they 
compose the situation.


I define a situation as you would expect, as a cluster of embodied 
spaces in the actual world. In a story, aspects of a situation are 
represented as images. Images are promiscuous in terms of what they 
can represent - not only tangible objects, but emotional sensations, 
philosophical positions, abstract impressions or metaphysical 
wonderings. They can be expressed via nouns or verbs, or both, or 
neither. They can also be expressed in the way the words are placed 
in relation to each other, or the frequency with which certain words 
occur. This is why linguistic descriptions of how words are related 
are not as important to me as understanding how images are related. 
My currency as a storyteller - the way I create meaning - lies in
the association of images. I fit them together to create represented 
situations. Images are clustered into networks, like stones that join 
to form virtual pavements, interpermeated and overlapping.


The situated nature of narrative information means that it is 
conditional, and no element is absolute in its identity. Indeed, the 
relativity of information is the subject of this FIS thread. But 
there is an additional quality to artistic information that you might 
find interesting. In narrative, when information is added 
incrementally, the identity of the entire system changes, and so 
isn't reducible to its parts. Let me give a crude example from Media 
Analysis, an activity that mines news reports for dominant narrative 
patterns. It should be noted that this example is noun-oriented for 
ease.


When an analyst determines the public relations implications of a 
news story, one method is to look at the images present. For example, 
a news article that contains the words "president", "car accident"
and "broken leg" conveys one sort of story (1). An article containing
the words "president", "car accident", "broken leg", and "woman
passenger" is a slightly different story (2). Watch how the story
changes when the list of images becomes: "president", "car accident",
"broken leg", "woman passenger", "woman not president's wife", "woman
dead" (3).


Even though these reports share identical elements, they have three 
different meanings, and not only because the story is longer. For 
example, in each version, the role played by the image broken leg 
changes. In story 1, concern about the president's broken leg relates 
to the prestige and importance of the office he holds. In story 2, 
there are still suggestions of this concern, but the broken leg is 
also the reason we know about his trip with a female passenger (an 
analyst would recommend PR intervention). In story 3, the president 
is responsible for such a bad situation that the broken leg is almost 
a deserved punishment, the beginning of a much larger punishment to 
come (an analyst would recommend a lawyer).


Of course there can be slightly different readings for these three 
stories, depending on the finer links between their images, although 
that sort of variation tends to be more common in fiction. There is 
actually not much variety in news reports, their recycled templates 
being part of what makes them fast to write and read.


This begins to relate to Steven's and Christophe's observations about 
newspapers and interpretation. As you might guess, I see pattern as 
an agent in terms of information and meaning. In artistic terms, two 
patterns draw out each other's shared features simply by having 
similar structures. Perhaps this also speaks to Walter's thoughts 
about information transference. If I tell a story about a dying 
friend, and you have lost a friend in a similar way, the factors of 
your