RE: [Fis] definitions of information

2007-11-10 Thread Stanley N. Salthe
Pedro said --

  Dear FIS colleagues,

 Adding to Bob's and Karl's comments on Shannonian info, I am still under
the influence of Seth Lloyd's (one of the founders of quantum computation)
insights about information physics. For him, the second law is but a statement
about information processing: how the underlying physical dynamic laws of
the universe preserve bits and prevent their number from decreasing.
Landauer's principle connects it with erasure... (and temperature becomes
energy per bit). Anyhow, some of Karl's related statements should be put
to the test -- first, by establishing empirically the number of
multidimensional partitions (a crucial point, in my view).
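
As a back-of-the-envelope illustration of the Landauer point (a minimal
Python sketch; room temperature is an assumed figure for the example, not
anything from Lloyd's text):

import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

E_bit = k_B * T * math.log(2)   # minimum erasure cost per bit, in joules
print(f"Minimum energy to erase one bit at {T} K: {E_bit:.3e} J")

# Read the other way round, temperature acts as an exchange rate between
# energy and bits: one joule of heat at T can pay for at most this many
# bit erasures, which is the sense in which T is "energy per bit".
print(f"Bits erasable per joule at {T} K: {1 / E_bit:.3e}")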

 Then, on Stan & Loet about semiosis, I civilly disagree. Perhaps I
should have written my ten points more universally (they were put mainly
around the street lamp of biology), but the central argument is clear:
where is there more generality concerning holistic information
(which for instance comprises generation, coding, emission, communication
channel, reception, decoding, meaningful interpretation, etc.): in
human language or in the bioinformational realm?
 S: In my view the situation is quite clear, given that the human
(sociocultural) realm developed out of the biological realm. From that
point of view, human semiosis must be a later development in the universe
than biosemiosis. Thus, biosemiosis is more generally present throughout
nature than human semiosis. Then, since we discover our semiotic
principles by studying human semiosis, it is natural to view biosemiosis as
a generalization of human semiosis. Thus, I do not see any disagreement
with Pedro when he continues:

 That's the question. Very shortly, I would bring three arguments for the
primacy of the latter: evolutionary (real origins), ontogenetic
(developmental process), and formal (Robert Rosen's train of thought about
physical/biological systems and degeneracy in Life Itself).
 but then Pedro continues:
 Otherwise, by straitjacketing the global discussion of info into some
particular semiotic or pansemiotic school, we are led into cul-de-sacs
with different decorations. As often stated in this list, we need new
thought, a new info synthesis.
 S: Now here Pedro seems to be rejecting the particular semiotic
theoretical framework that most semioticians (and particularly all
biosemioticians) use -- the Peircean triadic framework. This rejection may
be justified, but it would be nice to know what is being suggested as a
framework instead. It can be said (I think -- maybe I'm wrong) that
Peircean semiotics has not yet been integrated with information theory. I
think the relations here would likely be {information theory {Peircean
semiotics}}, with a reformulation of semiotics under the general rules of
information theory.

STAN

 best regards

 Pedro

 PS. By the way, a famous paper (initially a talk) by Lloyd on 31
Measures of Complexity suggests that a similar exercise may be a good idea
for our info field too. This is a suggestion addressed to Dail and other
colleagues of the nascent info institute.



=
Pedro C. Marijuán
Cátedra SAMCA
Institute of Engineering Research of Aragon (I3A)
Maria de Luna, 3. CPS, Univ. of Zaragoza
50018 Zaragoza, Spain
TEL. (34) 976 762761 and 762707, FAX (34) 976 762043
email: [EMAIL PROTECTED]
=
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


RE: [Fis] more thoughts about Shannon info

2007-11-10 Thread Loet Leydesdorff
Dear Bob and colleagues:

> Energy spontaneously tends to flow only from being concentrated in one place
> to becoming diffused or dispersed and spread out. And energy is governed by
> the first law of thermodynamics, which states that energy cannot be destroyed
> or created.

> Is there an equivalent 1st and 2nd law for information?

Yes, there is. The proof of the non-negativity of the information expectation
can be found on pp. 59f. of Henri Theil, Statistical Decomposition Analysis.
Amsterdam: North-Holland, 1972.
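
For illustration: the quantity Theil works with is the expected information
of a message transforming prior probabilities q into posteriors p, I = sum of
p_i * log(p_i / q_i), which is never negative (it is the Kullback-Leibler
divergence). A minimal numerical check, in Python, with made-up distributions:

import math

def expected_information(p, q):
    # Theil's expected information of the message transforming prior q
    # into posterior p: sum of p_i * log2(p_i / q_i). This equals the
    # Kullback-Leibler divergence and is always >= 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Made-up prior and posterior distributions for the example.
q = [0.25, 0.25, 0.25, 0.25]
p = [0.70, 0.10, 0.10, 0.10]

print(expected_information(p, q))   # positive
print(expected_information(q, q))   # zero: no revision, no information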

> Entropy is used to describe systems that are undergoing dynamic interactions,
> like the molecules in a gas. What is the analogy with Shannon entropy or
> information?

> Is Shannon's formula really the basis for a theory of information or is it
> merely a theory of signal transmission?

The issue is what you mean by "really": historically, it was only a theory of
signal transmission. However, it can be further elaborated into a theory of
information.

> Thermodynamic entropy involves temperature and energy in the form of heat,
> which is constantly spreading out. The entropy change ΔS is defined as ΔQ/T.
> What are the analogies for Shannon entropy?

The analogy with the Shannon entropy is strictly formal. Shannon's is a
mathematical theory; bits of information are dimensionless. The Boltzmann
constant k_B provides dimensionality to S. From this perspective,
thermodynamic entropy can be considered as a special case of Shannon entropy,
and thermodynamics as a special case of non-linear dynamics. One needs
physics as a special theory for its specification.
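
The formal link can be made explicit: for a distribution p over microstates,
the Gibbs entropy S = -k_B * sum of p_i * ln(p_i) is just the dimensionless
Shannon entropy H = -sum of p_i * log2(p_i) rescaled by k_B * ln 2. A small
Python sketch (the distribution over microstates is a toy assumption):

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def shannon_entropy_bits(p):
    # Dimensionless Shannon entropy H = -sum p_i log2 p_i, in bits.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    # Thermodynamic (Gibbs) entropy S = -k_B sum p_i ln p_i, in J/K.
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]   # toy distribution over microstates
H = shannon_entropy_bits(p)
S = gibbs_entropy(p)
print(H, S, S / (k_B * math.log(2)))   # the last value recovers H exactly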

> There is the flow of energy in thermodynamic entropy, but energy is
> conserved, i.e. it cannot be destroyed or created.

> There is the flow of information in Shannon entropy, but is information
> something that cannot be destroyed or created, as is the case with energy?
> Is it conserved? I do not think so, because when I share my information with
> you I do not lose information but you gain it, and hence information is
> created. Are not these thoughts that I am sharing with you, my readers,
> information that I have created?

One of the strengths of the Shannon entropy is its applicability to
dissipative systems. Dissipative systems are different from systems in which
the substance of the information distribution is conserved. This can further
be elaborated: in the special case of an ideal collision the thermodynamic
entropy production vanishes, while the Shannon-type entropy (that is, the
change in the distribution of energy and momenta) does not vanish but tends
to become maximal.
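
A toy illustration of that last point, in Python (the numbers are invented):
as energy disperses from few states over many, the Shannon entropy of the
distribution grows toward its maximum, log2(n) bits for n equiprobable states.

import math

def H(p):
    # Shannon entropy, in bits, of a discrete distribution p.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Energy initially concentrated, then progressively dispersed over
# four states (invented numbers, for illustration only).
stages = [
    [1.0, 0.0, 0.0, 0.0],        # fully concentrated: H = 0
    [0.7, 0.1, 0.1, 0.1],
    [0.4, 0.3, 0.2, 0.1],
    [0.25, 0.25, 0.25, 0.25],    # fully dispersed: H = log2(4) = 2 bits
]
for p in stages:
    print(p, round(H(p), 3))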

> Shannon entropy quantifies the information contained in a piece of data: it
> is the minimum average message length, in bits. Shannon information, as the
> minimum number of bits needed to represent the data, is similar to the
> formulations of Chaitin information or Kolmogorov information. Shannon
> information has functionality for engineering purposes, but since this is
> information without meaning it is better described as a measure of the
> amount and variety of the signal that is transmitted, and not described as
> information. Shannon information theory is really signal transmission
> theory. Signal transmission is a necessary but not a sufficient condition
> for communication. There is no way to formulate the semantics, syntax or
> pragmatics of language within the Shannon framework.

Agreed. One needs a special theory for specifying any substantive framework. 
However, the mathematical framework allows us to entertain developments in one 
substantive framework as heuristics in the other. Thus, we are able to move 
back and forth between frameworks using the formalizations. 
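
On the "minimum average message length" reading quoted above, a quick Python
sketch (the sample strings are invented, and zlib serves only as a rough
stand-in for a Kolmogorov-style description length):

import math
import zlib
from collections import Counter

def entropy_bits_per_symbol(s):
    # Empirical Shannon entropy of a string, in bits per symbol.
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

for s in ["aaaaaaaa" * 32, "abcdefgh" * 32]:
    h = entropy_bits_per_symbol(s)
    z = len(zlib.compress(s.encode()))
    print(f"H = {h:.2f} bits/symbol; zlib: {z} bytes for {len(s)} symbols")

Note that the second string has high per-symbol entropy yet compresses almost
as well as the first, because it is repetitive in structure: this is exactly
where Kolmogorov-style measures part company with the symbol-wise Shannon
estimate.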

With best wishes, 

Loet

Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20-525 6598; fax: +31-20-525 3681
[EMAIL PROTECTED]; http://www.leydesdorff.net/

Now available:
The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95
http://www.universal-publishers.com/book.php?method=ISBN&book=1581129378
The Self-Organization of the Knowledge-Based Society
http://www.universal-publishers.com/book.php?method=ISBN&book=1581126956
The Challenge of Scientometrics
http://www.universal-publishers.com/book.php?method=ISBN&book=1581126816

 
 

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis