Re: [Fis] more thoughts about Shannon info

2007-11-07 Thread Stanley N. Salthe
Commenting first on Bob's and then on Karl's:

Bob said--

>Dear colleagues - please forgive my lapse in communications. I have been
>studying the question of Shannon info and have come up with the following
>thoughts I want to share with you. As always comments are solicited.
>Rather than answering each point raised in our recent email exchanges I
>decided to do some research and try to answer a number of inquiries all at
>once.
>
>The inspiration for adopting the word entropy in information theory comes
>from the close resemblance between Shannon's formula and the very
>similar formula from thermodynamics: S = -k ∑ pi ln pi.  Entropy
>is related to the second law of thermodynamics that states that:
 S: Boltzmann's physical entropy (S) is formally a refinement of
Shannon's informational entropy (H) - that is, {H {S}}.
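
(A small Python sketch of my own, not part of Bob's text, just to make the
formal resemblance concrete; the distribution p is an arbitrary example:)

from math import log

def shannon_H(p, base=2):
    # Shannon's informational entropy H = -sum p_i log p_i (bits when base = 2).
    return -sum(pi * log(pi, base) for pi in p if pi > 0)

def gibbs_S(p, k_B=1.380649e-23):
    # Boltzmann/Gibbs physical entropy S = -k_B sum p_i ln p_i (joules per kelvin).
    return -k_B * sum(pi * log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]   # made-up probabilities over symbols/microstates
print(shannon_H(p))             # 1.75 bits
print(gibbs_S(p))               # same functional form, rescaled by k_B and ln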

>Energy spontaneously tends to flow only from being concentrated in one
>place to becoming diffused or dispersed and spread out. And energy is
>governed by the first law of thermodynamics that states that energy cannot
>be destroyed or created.
>
>
>Is there an equivalent 1st and 2nd law for information?
 S: There is.  In an expanding system (such as the universe) H must
increase (as a kind of 'Second Law' for information) along with S, even as
information itself increases.  Papers on this were published by
cosmologists David Layzer and Steven Frautschi, and physicist P.T.
Landsberg.  As to whether there is a first law for information, this is not
clear.   Universal expansion delivers new matter, and so it would seem to
deliver new informational constraints.

>Entropy is used to describe systems that are undergoing dynamic
>interactions like the molecules in a gas. What is the analogy with Shannon
>entropy or information?
 S: Informational entropy (H) increases as new information enters the
system and old information 'mutates'.

>Is Shannon's formula really the basis for a theory of information or is
>it merely a theory of signal transmission?
 S: Of the three common definitions of information, it refers only to
information as a decrease in uncertainty (H).
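
(A tiny worked example of that reading, with numbers I have made up: for a
fair eight-sided die the uncertainty is H = 3 bits; being told the outcome is
even leaves four equally likely faces, H = 2 bits, so the message carried 1 bit.)

from math import log2

H_before = log2(8)          # uncertainty about a fair eight-sided die: 3.0 bits
H_after  = log2(4)          # after learning "the outcome is even": 2.0 bits
print(H_before - H_after)   # information received = decrease in uncertainty = 1.0 bit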

>Thermodynamic entropy involves temperature and energy in the form of heat,
>which is constantly spreading out. Entropy S is defined as ∆Q/T. What
>are the analogies for Shannon entropy?
 S: Expansion, or growth, of a system generates new information, which
quickly modulates to informational entropy as uncertainty and contingency
creep in.

>There is the flow of energy in thermodynamic entropy but energy is
>conserved, i.e. it cannot be destroyed or created.
  S: There may not be a First Law for information.  There may also not
be a very important role for the First Law of thermodynamics in natural
(local, nonequilibrium) systems.

>There is the flow of information in Shannon entropy but is information
>something that cannot be destroyed or created as is the case with energy?
>Is it conserved? I do not think so because when I share my information
>with you I do not lose information but you gain it and hence information
>is created. Are not these thoughts that I am sharing with you, my readers,
>information that I have created?
  S: Agreed. Information in a growing system continues to increase.  No
limit is known (although the pattern is likely mostly asymptotic) -- unless
there could be some limit to the amount of uncertainty that a system can
bear.

>Shannon entropy quantifies the information contained in a piece of data:
>it is the minimum average message length, in bits. Shannon information as
>the minimum number of bits needed to represent it is similar to the
>formulations of Chaitin information or Kolmogorov information. Shannon
>information has functionality for engineering purposes but since this is
>information without meaning it is better described as the measure of the
>amount and variety of the signal that is transmitted and not described as
>information. Shannon information theory is really signal transmission
>theory. Signal transmission is a necessary but not a sufficient condition
>for communications. There is no way to formulate the semantics, syntax or
>pragmatics of language within the Shannon framework.
 S: What is missing is semiotics!
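
(To make the quoted point about "minimum average message length, in bits"
concrete, here is a Python sketch of my own; the sample string is arbitrary.
The empirical entropy is the lower bound on the average code length of any
lossless encoding of these symbol frequencies:)

from collections import Counter
from math import log2

def entropy_bits_per_symbol(msg):
    # Shannon entropy of the empirical symbol distribution of msg.
    counts = Counter(msg)
    n = len(msg)
    return -sum((c / n) * log2(c / n) for c in counts.values())

msg = "the quick brown fox jumps over the lazy dog"
H = entropy_bits_per_symbol(msg)
print(round(H, 3), "bits/symbol; at least",
      round(H * len(msg)), "bits for the whole message")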

And here I reply to Karl:
>Commenting upon Pedro's and Stan's:
>
>>Thus, I come back to meaning, helas, to do the same than the theoretical
>>physicist, but in the province of biology. The following 10 points could be
>>defended:
>>
>>1. Meaning is built molecularly, by the living cell.
> S: This is the position of the biosemiotics community (Semiotics -
>the study of meaning construction). With a nod to Loet, the procedure is
>to begin with the most highly developed example of semiosis that we know of
>-- human discourse -- to derive the necessary categories (induction, etc.),
>which are then generalized in the spirit of systems science, so as to apply
>them to biosemiosis, and all the way to pansemiosis if we like.
>
>   K: We 

Re: [Fis] more thoughts about Shannon info

2007-11-07 Thread karl javorszky
The model proposed for the numeric treatment of information answers the following
points raised by Shannon and Logan:
Logan:

>  The inspiration for adopting the word entropy in information theory comes
> from the close resemblance between Shannon's formula and the very similar
> formula from thermodynamics: S = -k ∑ pi ln pi .   Entropy is related to
> the second law of thermodynamics that states that:
>
> Energy spontaneously tends to flow only from being concentrated in one
> place to becoming diffused or dispersed and spread out. And energy is
> governed by the first law of thermodynamics that states that energy cannot
> be destroyed or created.
>
> Is there an equivalent 1st and  2nd law for information?
>

Yes, there is. Information being the relation between the number of symbols
and their kind (extent, in the numeric sense; kind or properties, in the
logical sense), one can propose the following observation to be generalised
into a rule:
A closed system of symbols can be transformed into a differing closed system
of symbols while maintaining an identical informational content.
This means that the relation between the number of symbols and their kind
cannot be destroyed or created.
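
(One way to read this rule in Shannon's terms - a sketch of mine, not Karl's
numeric model - is that a one-to-one recoding of the kinds of symbols yields a
different closed system of symbols with exactly the same informational content H:)

from collections import Counter
from math import log2

def shannon_entropy(seq):
    # H = -sum p_i log2 p_i over the symbol frequencies of seq.
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

original = "ABRACADABRA"
recoded  = original.translate(str.maketrans("ABCDR", "xyzwv"))  # bijective recoding

print(shannon_entropy(original))   # identical to ...
print(shannon_entropy(recoded))    # ... the transformed system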

> Entropy is used to describe systems that are undergoing dynamic
> interactions like the molecules in a gas. What is the analogy with Shannon
> entropy or information?
>

In both cases, this is a LOCAL phenomenon taking place in a globally closed
system. While a part of the system cools down to a uniform level, a
different part of the system heats up (explodes, fuses, contracts, etc.). In
the numeric model, if the overall constant of information content of an
assembly remains the same (as it definitely does, assuming a finite *n*),
there may well be subsegments in the logical space which are more uniform
than other subsegments.
(Example: all true logical sentences that detail the relation between parts
and the whole, with the whole being < 137, form the closed universe. This set
has a given, constant, overall information content. It may however very well
be the case that one specific subset has a deviating extent - locally - of its
own - local - information content.
Numeric explanation:
It may be that 66 is, with a probability of 90%, partitioned into 10 .. 18
parts, but it may as well be that one of the cases describes a freakish
assembly of far too many 1s as opposed to bigger summands. Any of the summands
can be outside its most probable range, and it is a certainty that one will
observe a local phenomenon of summands dissolving into elementary units.)
This process happens in the actual Nature surrounding us often enough, and is
unusual enough, that humans notice and remember it and give it a name. It
appears that this process carries the name of "entropy".
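
(The numeric claim above can be checked directly. The following Python sketch
is mine; the 90% figure stays Karl's claim to be tested, not something I am
asserting. It counts the partitions of 66 and the share of them that have
between 10 and 18 summands:)

from functools import lru_cache

@lru_cache(maxsize=None)
def parts_exact(n, k):
    # Number of partitions of n into exactly k positive parts.
    if k == 0:
        return 1 if n == 0 else 0
    if n < k:
        return 0
    # Either the smallest part is 1 (drop it), or all parts are >= 2
    # (subtract 1 from each of the k parts).
    return parts_exact(n - 1, k - 1) + parts_exact(n - k, k)

n = 66
total = sum(parts_exact(n, k) for k in range(1, n + 1))   # = p(66), all partitions
band  = sum(parts_exact(n, k) for k in range(10, 19))     # those with 10 .. 18 parts
print("p(", n, ") =", total)
print("share with 10 .. 18 parts:", round(band / total, 3))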

> Is Shannon's formula really the basis for a theory of information or is it
> merely a theory of signal transmission?
>

No, Shannon's formula is not the basis of anything in information theory.
Shannon's formula is the roof of an edifice based on the logic of
similarities. Information theory deals with a different basic concept. In
information theory the kind of a symbol is also a logical category of its
own and is not derived from the number of symbols. (In classical -
similarity-based - logic, the kind of a symbol derives from its number, that
is, from the number of elementary units that make up the whole.) Information
theory negates the assumption that the parts actually and absolutely fuse
into a whole, an assumption which is built into the procedures currently in
exclusive use relating to the operation of addition.

> Thermodynamic entropy involves temperature and energy in the form of
> heat, which is constantly spreading out. Entropy S  is defined as ∆Q/T. What
> are the analogies for Shannon entropy?
>
> There is the flow of energy in thermodynamic entropy but energy is
> conserved, i.e. it cannot be destroyed or created.
>
> There is the flow of information in Shannon entropy but is information
> something that cannot be destroyed or created as is the case with energy? Is
> it conserved? I do not think so because when I share my information with you
> I do not lose information but you gain it and hence information is created.
> Are not these thoughts that I am sharing with you, my readers, information
> that I have created?
>
Globally:
In the case that humans DISCOVER the a-priori existing laws of Nature, your
communication does not transmit anything new, because the connections have
always been there; we simply did not notice them before.
In the case that humans CREATE mental images depicting a Nature that is - for
axiomatic reasons - not intelligible to humans, your communication says: "Is
it new for you that I can make myself understood?" and is a communication
for grammatical reasons, with no content.
Locally:
Your communication reorders the concepts within the brain of the reader and
presumably changes some relations between the number of symbols and kinds of
symbols that were and are