RE: [Fis] more thoughts about Shannon info

2007-11-19 Thread Loet Leydesdorff
> > JC: This is true. However any theory that is not consistent
> > with physics is in
> > LaLa Land as far as I am concerned. If you have a good
> > argument why this is not reasonable, I would like to know.
>
> I suppose that most of the social sciences are not inconsistent with
> physics, but also not so relevant for it. Entropy calculus
> can be used in
> much broader range of sciences because of its mathematical character.
>
> I am sure that on the stock exchange, stocks are physically or
> electronically exchanged. However, the value of the stocks
> has nothing to do with these physical carriers.

However, the information they carry does include information about things
other than the "physical carriers", or else there wouldn't be much point in
trading them. If the connections don't line up the right way to fit the
physical parameters, such as resources, waste, consumption, etc., then
something will go wrong, in much the same way as it goes wrong when our
representations do not correspond to the world. There has to be a match
between the encoding and what is encoded, or anticipations will eventually
fail. At least that is what happens in the sort of biological systems that I
look at. For example, the genetic code is fairly arbitrary, but unless it
codes not just for aspects of phenotypic expression but also for aspects of
the environment (not to mention the internal workings and processes of the
organism), maladaptation will occur. A completely free-floating level would
be irrelevant to anything else. Interesting perhaps, but pretty useless.

John

Dear John: 

Since this is a new week, let me assure you that I was not talking about
"freely floating" angels sitting on the tip of a needle, but about levels of
(self-)organization other than physics, which are indeed constrained and
enabled by their physical conditions. The dynamics of these systems are not
in LaLa Land; they are studied, for example, in the biological and social
sciences.

In a formal sense the physical determination is limited to the mutual
information between the physical world and the self-organizing dynamics of
the emerging systems. The relevance of this mutual information (transmission)
can be rather limited, for example, in meaning-processing systems. I was
just objecting to the (perhaps erroneous) impression that you had converted
to reductionism by discarding all other systems as LaLa Land. The formalisms
of entropy statistics are not constrained by their physical applications
(except, of course, that one of us has to develop and communicate them).
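
For readers who want the transmission term spelled out, here is a minimal
sketch (Python; the joint distribution and variable names are an invented toy
example, not data about any physical or social system) of the mutual
information between two discrete variables:

    from math import log2

    def entropy(p):
        """Shannon entropy in bits of a sequence of probabilities."""
        return -sum(x * log2(x) for x in p if x > 0)

    # Invented joint distribution p(x, y) over a "physical" variable x and
    # an "emergent" variable y; the numbers are illustrative only.
    p_xy = {(0, 0): 0.40, (0, 1): 0.10,
            (1, 0): 0.10, (1, 1): 0.40}

    p_x = [sum(v for (x, _), v in p_xy.items() if x == i) for i in (0, 1)]
    p_y = [sum(v for (_, y), v in p_xy.items() if y == j) for j in (0, 1)]

    # Transmission T = H(x) + H(y) - H(x,y); zero when x and y are independent.
    T = entropy(p_x) + entropy(p_y) - entropy(p_xy.values())
    print(round(T, 3), "bit of mutual information")

When this transmission is small compared with the uncertainty of the emerging
system's own states, the physical determination is correspondingly weak, which
is the sense in which its relevance can be rather limited.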

With best wishes,


Loet


Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
  [EMAIL PROTECTED] ;
 http://www.leydesdorff.net/ 

 
Now available:
The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95
The Self-Organization of the Knowledge-Based Society
The Challenge of Scientometrics



RE: [Fis] more thoughts about Shannon info

2007-11-16 Thread Stanley N. Salthe
Responding to John --

> At 10:01 PM 2007/11/14, Stanley N. Salthe wrote:
>Replying to Loet, Rafael
> Loet said:
>
> >LL: The analogy with the Shannon entropy is strictly formal.  Shannon's is a
> >mathematical theory; bits of information are dimensionless.
> >The Boltzmann constant (k(B)) provides dimensionality to S.
>
> JC: Oh, not this canard again. S has units of energy over temperature.
> deltaQ/T.  T is energy per degree of freedom. K/M, where M is multiplicity.
> So S units are proportional to E units, and inversely to E units proportioned
> to M.  I am ignoring anything but the dimensionality here. So entropy has
>units of
> degrees of freedom. As I pointed out in some detail in a previous post, this
> is not an unreasonable unit for information.
  S: Degrees of freedom is an appropriate unit for entropy taken as
disorder/uncertainty.  As a diminution of uncertainty, information could
then also be in degrees of freedom, but it would be nice to have in it a
measure of how much of a decrease this is from before we had the information:
I = Hmax - Hactual. (Alas, Hmax is seldom available in natural systems.)
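
A minimal numeric sketch of this reading (Python; the six-alternative
distribution is an invented example):

    from math import log2

    def H(p):
        """Shannon entropy in bits."""
        return -sum(x * log2(x) for x in p if x > 0)

    # Six equiprobable alternatives before the message, an uneven
    # distribution afterwards (illustrative numbers only).
    n_alternatives = 6
    H_max = log2(n_alternatives)
    H_actual = H([0.5, 0.2, 0.1, 0.1, 0.05, 0.05])

    I = H_max - H_actual                 # I = Hmax - Hactual, in bits
    print(round(H_max, 3), round(H_actual, 3), round(I, 3))

As noted, Hmax (and hence this difference) is only available when the space of
alternatives is well delimited.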

> JC: The two concepts are commensurate in units. Sometimes entropy is
> described in terms of dimensionless entropy units by knowledgeable
> physicists, as the measure has no dimension, but it is useful to identify
> the application to physical cases. However the same measure can be
> used for any case as long as we keep macrostates and microstates
> properly defined. See my papers:
> Entropy in
>Evolution (1986)
> 
>http://www.ukzn.ac.za/undphil/collier/papers/entev.pdf
> and
> Hierarchical Dynamical
>Information Systems With a Focus on Biology (Entropy 2003)
> 
>http://www.mdpi.org/entropy/papers/e5020100.pdf
> for details that dot the i's and cross the t's.
>
>>LL: Thermodynamic entropy can be considered as a special case of Shannon
>>entropy,
>>from this perspective. Thermodynamics can thus be considered as a special
>>case of
> >non-linear dynamics from this perspective. One needs physics as a
>special theory
>> for its specification.
>
> JC: This is true. However any theory that is not consistent with physics
>is in
> LaLa Land as far as I am concerned. If you have a good argument why this
> is not reasonable, I would like to know.
> It should be noted, however, that Shannon information
 S: I presume Shannon information means H, information carrying capacity.

> can increase with
> a passive filter. In fact this is the case unless it is a measure of the
> information in the microstate of a system, which is generally not
> accessible to direct measurement or manipulation. What we can
> directly manipulate and measure is the complement of this information,
> and it is typically called negentropy. We can define the negentropy of
> a system rigorously as the difference between its actual entropy (which
> depends on M above, which depends on which degrees of freedom
> are accessible) and the entropy the system would have if all constraints
> internal to the system are relaxed (it is a theorem of classical
> thermodynamics without any assumption of equilibrium that this
> is unique -- the theorem can be used as the central postulate of
> thermodynamics -- see Carathéodory, also Kestin).
 S: So, again:  I (negentropy) = Hmax - Hactual

>JC: We have to
> have negentropy for Shannon information, if we are to use the
> information to communicate or to control things. Personally, I
> think that Shannon was talking only about this, and to call the
> more general property Shannon entropy is very misleading. It
> can't be false, because definitions are not true or false.
 S: Well, I think it deplorable to call information carrying capacity
(H) Shannon information (Hmax-Hactual) (if that is what is being done here).

>JC:  So if we get down to details, Shannon theory and statistical mechanics
> are really parts of the same, more general theory. I prefer to call it
> statistical mechanics, or statistical dynamics, or something like that,
> as it applies to all dynamical systems pretty much a priori. The
> empirical part is in identifying the dynamical parameters properly,
> and the subsystems that are stable enough to treat as objects
> to be quantified over in the formal representation of the theory.
>
> AGAIN >>LL: Thermodynamic entropy can be considered as a special case of
>Shannon entropy,
>>from this perspective. Thermodynamics can thus be considered as a special
>>case of
> >non-linear dynamics from this perspective. One needs physics as a
>special theory
>> for its specification.
> S: Agreed.
>
>JC:  You shouldn't agree to that, Stan. It will only confuse issues if
> you keep the two notions conceptually separate. In fact you
> do that in the next passage.
>
> S: There is, however, a

RE: [Fis] more thoughts about Shannon info

2007-11-14 Thread Loet Leydesdorff
>  S: Agreed.  There is, however, an interesting further 
> recent viewpoint
> in physics (e.g., Dewar, R.C., 2005.  Maximum entropy 
> production and the
> fluctuation theorem.  J. Phys. A, Math. & General 38: 
> L371-L381), which
> pulls together the Shannon type entropy (variety) and physical entropy
> production.  The idea here is referred to as the maximum 
> entropy production
> principle (MEP).  Dewar has shown that a system that can assume many
> different conformations will tend to take up one that 
> maximizes its
> entropy production.  Thus, maximum entropy (H) (MaxEnt) facilitates
> maximizing entropy (S) production (MEP).  And so, the 
> connection is that if
> a system has greater behavioral entropy (H), it will better be able to
> further increase its entropy production.  So, not only is S a 
> refinement of
> H -- {H {S}} -- it will also be produced more by a system with larger
> behavioral H.

Dear Stan, 

An interesting consequence from my perspective (anticipatory systems and
more generally, systems which provide meaning to models of themselves and
their environment) would be that the production of negative (Shannon)
entropy would also limit the entropy production flux. 

Indeed, this follows from the algorithms: these systems function as filters.
Reflexivity suppresses chaotic development and thus makes it possible to
process more complexity.
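
One way to give the filter remark a concrete Shannon reading (a sketch only;
the mapping below is an invented example, not the algorithms referred to): a
deterministic, many-to-one filter applied to a signal can never increase its
entropy, H(f(X)) <= H(X).

    from math import log2
    from collections import defaultdict

    def H(p):
        """Shannon entropy in bits."""
        return -sum(x * log2(x) for x in p if x > 0)

    # Illustrative source distribution over four signal values.
    p_x = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}

    # A many-to-one "filter": several inputs are mapped onto one output.
    f = {"a": "A", "b": "A", "c": "B", "d": "B"}

    p_y = defaultdict(float)
    for x, p in p_x.items():
        p_y[f[x]] += p

    print(round(H(p_x.values()), 3))    # entropy of the unfiltered signal
    print(round(H(p_y.values()), 3))    # entropy after filtering; never larger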

With best wishes, 


Loet


Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
[EMAIL PROTECTED] ; http://www.leydesdorff.net/ 
 
Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated.
385 pp.; US$ 18.95 

 
 



RE: [Fis] more thoughts about Shannon info

2007-11-14 Thread Stanley N. Salthe
Replying to Loet, Rafael

Loet said:

>The analogy with the Shannon entropy is strictly formal.  Shannon's is a
>mathematical theory; bits of information are dimensionless. The
>Boltzmann constant (k(B)) provides dimensionality to S. Thermodynamic
>entropy  can be considered as a special case of Shannon entropy, from this
>perspective. Thermodynamics can thus be considered as a special case of
>non-linear dynamics from this perspective. One needs physics as a special
>theory  for its specification. 
 S: Agreed.  There is, however, an interesting further recent viewpoint
in physics (e.g., Dewar, R.C., 2005.  Maximum entropy production and the
fluctuation theorem.  J. Phys. A, Math. & General 38: L371-L381), which
pulls together the Shannon type entropy (variety) and physical entropy
production.  The idea here is referred to as the maximum entropy production
principle (MEP).  Dewar has shown that a system that can assume many
different conformations will tend to take up one that maximizes its
entropy production.  Thus, maximum entropy (H) (MaxEnt) facilitates
maximizing entropy (S) production (MEP).  And so, the connection is that if
a system has greater behavioral entropy (H), it will better be able to
further increase its entropy production.  So, not only is S a refinement of
H -- {H {S}} -- it will also be produced more by a system with larger
behavioral H.
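
For concreteness, the MaxEnt inference step itself can be sketched as follows
(Python; a Jaynes-style maximum-entropy distribution under a single mean
constraint, with invented numbers -- this is not Dewar's derivation of MEP,
which applies maximum entropy to paths rather than states):

    import math

    # States 0..5 with an imposed mean of 1.5 (invented numbers). The
    # maximum-entropy distribution under a mean constraint has the form
    # p_i proportional to exp(-lam * i); lam is found by bisection,
    # since the mean decreases monotonically as lam grows.
    states = range(6)
    target_mean = 1.5

    def mean_for(lam):
        w = [math.exp(-lam * s) for s in states]
        return sum(s * wi for s, wi in zip(states, w)) / sum(w)

    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2

    w = [math.exp(-lam * s) for s in states]
    p = [wi / sum(w) for wi in w]
    H = -sum(pi * math.log2(pi) for pi in p)
    print(round(lam, 3), [round(pi, 3) for pi in p], round(H, 3))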

Replying to Rafael --

>Stanley
>
>I think this is less a question of "grand theories" than of persons and
>interests. In this list the natural scientists are in the majority (?)
>and they have their special interests and "blind spots" (like everybody
>too).
>
>When the discussion turns to other themes as, say, atoms and cells
>into, say, culture or economics or a specific historical event or a
>piece of art or..., then there is (almost) no more interest in a
>"discussion" and the possibility of explaining (also in a Peircean
>framework) say Shakespeare's "Macbeth" out of the interaction
>of neurons becomes absurd not just because such an "explanation"
>would never explain what "Macbeth" is all about but also because
>the endless chain of causes and effects could never be discovered
>and if it is discovered it does not reach "Macbeth" but "just" the
>process leading to it.
  S: I agree with Rafael here.  And it goes beyond 'Macbeth'!  Any
actual occasion will be largely historically determined -- the writing of a
play, a particular performance of a play, and so on -- including a
thunderstorm, or the death of a last member of a species.  Science deals
only in repeatable observables, with what occasions have in common.  The
informational constraints bearing upon any event will all have been
historically constituted.  Thus, take Y = aX^b.  The relationship here
may have been scientifically determined, and found to be robust from one
instance to another, but the actual values of the informational
constraints, a and b, are unique, and were the results of historical
contingencies.  Science cannot 'understand' them, only discover and use
them.  The subject matter of much of humanities study concerns events such
as those that led to setting such values, or to examining the settings in
detail, all of which goes beyond the interest and competence of science.
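
A sketch of the 'discover and use' half of this point (Python; the
observations are invented for illustration): fitting the historically fixed
constants a and b of Y = aX^b from data, which is all science can do with
them.

    import numpy as np

    # Invented observations assumed to follow Y = a * X**b plus noise.
    X = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    Y = np.array([3.1, 5.9, 12.2, 23.8, 48.5])

    # In log-log space the power law is linear, log Y = log a + b log X,
    # so ordinary least squares recovers the constants.
    b, log_a = np.polyfit(np.log(X), np.log(Y), 1)
    a = np.exp(log_a)
    print("a =", round(float(a), 2), " b =", round(float(b), 2))

The fit discovers the values; why they are what they are remains a historical
question.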

>This is not a plea for "defeatism" but just an attempt not to lie to ourselves
>when we speak about "interdisciplinarity" and the like. Such a
>dialogue is only possible if there is a common phenomenon we
>can address from different perspectives and...a common interest
>in doing this.
 S: Consider the equation above.  Science would try to understand why X
relates to Y in a certain way, humanities might try to understand why the
informational values of a and b are exactly as they are.

>The case of "information" is clearly only apparently
>such a common phenomenon. Why? because information is not
>a thing or a property of things but a second order category (agree with
>Peirce).
>The best we can achieve in this regard is to compare (in some way)
>information (and communication) phenomena in, say, the cell
>or in society. But such a comparison is probably in many cases
>not very attractive to persons interested only or mainly in studying
>one phenomenon and not the other. Most of the time the discussions
>are frustrating for both sides.
  S: Ah, yes.  You have pointed to a reason why (in the US at least)
disciplines such as Systems Science and Semiotics have had little success
in Academia.  They are transdisciplinary or metadisciplinary.  They are
extremely abstract.  This evidently appeals to few minds, and, I suspect, it
has not yet been imagined how to USE them to further economic growth.

STAN






RE: [Fis] more thoughts about Shannon info

2007-11-11 Thread Loet Leydesdorff
PS. I realized later during the day that the formal equivalence of the 
thermodynamic and probabilistic entropies follows directly from the formula: ΔS 
= k(B) * ΔH. Since k(B) is a constant, the dynamic properties of S have to be 
the same as the dynamic properties of H. The dynamics in S cannot find their 
origin in the dynamics of a constant. The constant provides only the 
dimensionality of the thermodynamic entropy. The probabilistic entropy is the 
general case (although historically later). 
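
A back-of-the-envelope sketch of that correspondence (Python; the bit counts
are arbitrary illustrations): with H counted in bits, ΔS = k(B) * ln(2) * ΔH,
so the constant only rescales and cannot add any dynamics of its own.

    import math

    k_B = 1.380649e-23          # Boltzmann constant, J/K

    def delta_S(delta_H_bits):
        """Thermodynamic entropy change corresponding to a change of
        delta_H_bits in the probabilistic entropy: dS = k_B * ln(2) * dH."""
        return k_B * math.log(2) * delta_H_bits

    print(delta_S(1.0))         # one bit: about 9.57e-24 J/K
    print(delta_S(1e23))        # ~10^23 bits: about 0.96 J/K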
 
In his book, Henri Theil additionally provides the proof that ΔH is always 
positive by taking the derivative of the formula H = - Σ p(i) log p(i). This 
can be considered as the probabilistic equivalent of the second law. Under 
specifiable conditions negative (probabilistic) entropies can occur, for 
example, in the case of the mutual information in three dimensions (Ulanowicz). 
However, the Boltzmann constant is irrelevant for the derivation. 
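
The three-dimensional case can be illustrated with a standard construction
(Python; the XOR example is a textbook case, not Ulanowicz's material, and
sign conventions for the three-way term differ between authors):

    from math import log2
    from itertools import product

    def H(dist):
        """Shannon entropy in bits of a dict {outcome: probability}."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # Two fair, independent bits and their XOR: any two of the three
    # variables are independent, but the three together are fully coupled.
    pxyz = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

    def marginal(keep):
        out = {}
        for k, p in pxyz.items():
            key = tuple(k[i] for i in keep)
            out[key] = out.get(key, 0.0) + p
        return out

    Hx, Hy, Hz = (H(marginal((i,))) for i in range(3))
    Hxy, Hxz, Hyz = H(marginal((0, 1))), H(marginal((0, 2))), H(marginal((1, 2)))
    Hxyz = H(pxyz)

    # Mutual information in three dimensions (one common sign convention):
    I3 = Hx + Hy + Hz - Hxy - Hxz - Hyz + Hxyz
    print(I3, "bit")            # comes out negative (-1.0) for this example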
 
Best wishes,  Loet
 
 
 
Dear Bob and colleagues:

Energy spontaneously tends to flow only from being concentrated in one place 

to becoming diffused or dispersed and spread out. And energy is governed by the 
first law of thermodynamics that states that energy cannot be destroyed or 
created.

Is there an equivalent 1st and  2nd law for information? 

Yes, there is. The proof of the non-negativity of the information expectation 
can be found at pp. 59f. of Henri Theil, Statistical Decomposition Analysis. 
Amsterdam: North-Holland, 1972. 

Entropy is used to describe systems that are undergoing dynamic interactions 
like the molecules in a gas. What is the analogy with Shannon entropy or 
information?

Is Shannon’s formula really the basis for a theory of information or is it 
merely a theory of signal transmission? 

The issue is what you mean by "really": historically, it was only a theory of 
signal transmission. However, it can be further elaborated into a theory of 
information.  

Thermodynamic entropy involves temperature and energy in the form of heat, 
which is constantly spreading out. Entropy S  is defined as ∆Q/T. What are the 
analogies for Shannon entropy?  

The analogy with the Shannon entropy is strictly formal. Shannon's is a 
mathematical theory; bits of information are dimensionless. The 
Boltzmann constant (k(B)) provides dimensionality to S. Thermodynamic entropy 
can be considered as a special case of Shannon entropy, from this perspective. 
Thermodynamics can thus be considered as a special case of non-linear dynamics 
from this perspective. One needs physics as a special theory for its 
specification. 

There is the flow of energy in thermodynamic entropy but energy is conserved, 
i.e. it cannot be destroyed or created.

There is the flow of information in Shannon entropy but is information 
something that cannot be destroyed or created as is the case with energy? Is it 
conserved? I do not think so because when I share my information with you I do 
not lose information but you gain it and hence information is created. Are not 
these thoughts that I am sharing with you, my readers, information that I have 
created? 

One of the strengths of the Shannon entropy is its application to dissipative 
systems. Dissipative systems are different from systems in which the substance 
of the information distribution is conserved. This can further be elaborated: 
in the special case of an ideal collision the thermodynamic entropy vanishes, 
but the Shannon-type entropy (that is, the change in the distribution of energy 
and momenta) does not vanish, but tends to become maximal. 

Shannon entropy quantifies the information contained in a piece of data: it is 
the minimum average message length, in bits. Shannon information as the minimum 
number of bits needed to represent it is similar to the formulations of Chaitin 
information or Kolmogorov information. Shannon information has functionality 
for engineering purposes but since this is information without meaning it is 
better described as the measure of the amount and variety of the signal that is 
transmitted and not described as information. Shannon information theory is 
really signal transmission theory. Signal transmission is a necessary but not a 
sufficient condition for communications. There is no way to formulate the 
semantics, syntax or pragmatics of language within the Shannon framework. 

Agreed. One needs a special theory for specifying any substantive framework. 
However, the mathematical framework allows us to entertain developments in one 
substantive framework as heuristics in the other. Thus, we are able to move 
back and forth between frameworks using the formalizations. 
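
As a small illustration of moving between the frameworks (Python; the sample
string is invented): comparing the Shannon bound on average symbol-code
length, the "minimum average message length, in bits" quoted above, with what
an off-the-shelf compressor actually achieves.

    import zlib
    from collections import Counter
    from math import log2

    msg = b"the cat sat on the mat and the cat sat on the hat " * 20

    # Per-symbol Shannon entropy of the character distribution: a lower
    # bound, in bits per symbol, on any symbol-by-symbol code for this
    # source model.
    counts = Counter(msg)
    n = len(msg)
    H = -sum((c / n) * log2(c / n) for c in counts.values())

    comp = zlib.compress(msg, 9)
    print(round(H, 2), "bits/symbol; symbol-code bound ~", round(H * n / 8), "bytes")
    print(len(comp), "bytes from zlib (it exploits repetition beyond single symbols)")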

With best wishes, 

Loet


Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
  [EMAIL PROTECTED] ;   
http://www.leydesdorff.net/ 

 

RE: [Fis] more thoughts about Shannon info

2007-11-10 Thread Loet Leydesdorff
Dear Bob and colleagues:

Energy spontaneously tends to flow only from being concentrated in one place 

to becoming diffused or dispersed and spread out. And energy is governed by the 
first law of thermodynamics that states that energy cannot be destroyed or 
created.

Is there an equivalent 1st and  2nd law for information? 

Yes, there is. The proof of the non-negativity of the information expectation 
can be found at pp. 59f. of Henri Theil, Statistical Decomposition Analysis. 
Amsterdam: North-Holland, 1972. 

Entropy is used to describe systems that are undergoing dynamic interactions 
like the molecules in a gas. What is the analogy with Shannon entropy or 
information?

Is Shannon’s formula really the basis for a theory of information or is it 
merely a theory of signal transmission? 

The issue is what you mean by "really": historically, it was only a theory of 
signal transmission. However, it can be further elaborated into a theory of 
information.  

Thermodynamic entropy involves temperature and energy in the form of heat, 
which is constantly spreading out. Entropy S  is defined as ∆Q/T. What are the 
analogies for Shannon entropy?  

The analogy with the Shannon entropy is strictly formal. Shannon's is a 
mathematical theory; bits of information are dimensionless. The 
Boltzmann constant (k(B)) provides dimensionality to S. Thermodynamic entropy 
can be considered as a special case of Shannon entropy, from this perspective. 
Thermodynamics can thus be considered as a special case of non-linear dynamics 
from this perspective. One needs physics as a special theory for its 
specification. 

There is the flow of energy in thermodynamic entropy but energy is conserved, 
i.e. it cannot be destroyed or created.

There is the flow of information in Shannon entropy but is information 
something that cannot be destroyed or created as is the case with energy? Is it 
conserved? I do not think so because when I share my information with you I do 
not lose information but you gain it and hence information is created. Are not 
these thoughts that I am sharing with you, my readers, information that I have 
created? 

One of the strengths of the Shannon entropy is its application to dissipative 
systems. Dissipative systems are different from systems in which the substance 
of the information distribution is conserved. This can further be elaborated: 
in the special case of an ideal collision the thermodynamic entropy vanishes, 
but the Shannon-type entropy (that is, the change in the distribution of energy 
and momenta) does not vanish, but tends to become maximal. 

Shannon entropy quantifies the information contained in a piece of data: it is 
the minimum average message length, in bits. Shannon information as the minimum 
number of bits needed to represent it is similar to the formulations of Chaitin 
information or Kolmogorov information. Shannon information has functionality 
for engineering purposes but since this is information without meaning it is 
better described as the measure of the amount and variety of the signal that is 
transmitted and not described as information. Shannon information theory is 
really signal transmission theory. Signal transmission is a necessary but not a 
sufficient condition for communications. There is no way to formulate the 
semantics, syntax or pragmatics of language within the Shannon framework. 

Agreed. One needs a special theory for specifying any substantive framework. 
However, the mathematical framework allows us to entertain developments in one 
substantive framework as heuristics in the other. Thus, we are able to move 
back and forth between frameworks using the formalizations. 

With best wishes, 

Loet


Loet Leydesdorff 
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681 
  [EMAIL PROTECTED] ;   
http://www.leydesdorff.net/ 

 
Now available:
The Knowledge-Based Economy: Modeled, Measured, Simulated. 385 pp.; US$ 18.95
The Self-Organization of the Knowledge-Based Society
The Challenge of Scientometrics

 
 



Re: [Fis] more thoughts about Shannon info

2007-11-07 Thread Stanley N. Salthe
Commenting first on Bob's and then on Karl's:

Bob said--

>Dear colleagues - please forgive my lapse in communications. I have been
>studying the question of Shannon info and have come up with the following
>thoughts I want to share with you. As always comments are solicited.
>Rather than answering each point raised in our recent email exchanges I
>decided to do some research and try to answer a number of inquiries all at
>once.
>
>The inspiration for adopting the word entropy in information theory comes
>from the close resemblance between Shannon's formula and the very
>similar formula from thermodynamics: S = -k ∑ pi ln pi.  Entropy
>is related to the second law of thermodynamics that states that:
 S: Boltzmann's physical entropy (S) is formally a refinement of
Shannon's informational entropy (H) - that is, {H {S}}.

>Energy spontaneously tends to flow only from being concentrated in one
>place to becoming diffused or dispersed and spread out. And energy is
>governed by the first law of thermodynamics that states that energy cannot
>be destroyed or created.
>
>
>Is there an equivalent 1st and 2nd law for information?
 S: There is.  In an expanding system (such as the universe) H must
increase (as a kind of 'Second Law' for information) along with S, even as
information itself increases.  Papers on this were published by
cosmologists David Layzer and Steven Frautschi, and physicist P.T.
Landsberg.  As to whether there is a first law for information, this is not
clear.   Universal expansion delivers new matter, and so it would seem to
deliver new informational constraints.

>Entropy is used to describe systems that are undergoing dynamic
>interactions like the molecules in a gas. What is the analogy with Shannon
>entropy or information?
 S: Informational entropy (H) increases as new information enters the
system and old information 'mutates'.

>Is Shannon's formula really the basis for a theory of information or is
>it merely a theory of signal transmission?
 S: Of the three common definitions of information, it refers only to
information as a decrease in uncertainty (H).

>Thermodynamic entropy involves temperature and energy in the form of heat,
>which is constantly spreading out. Entropy S is defined as ∆Q/T. What
>are the analogies for Shannon entropy?
 S: Expansion, or growth, of a system generates new information, which
quickly modulates to informational entropy as uncertainty and contingency
creep in.

>There is the flow of energy in thermodynamic entropy but energy is
>conserved, i.e. it cannot be destroyed or created.
  S: There may not be a First Law for information.  There may also not
be a very important role for the First Law of thermodynamics in natural
(local, nonequilibrium) systems.

>There is the flow of information in Shannon entropy but is information
>something that cannot be destroyed or created as is the case with energy?
>Is it conserved? I do not think so because when I share my information
>with you I do not lose information but you gain it and hence information
>is created. Are not these thoughts that I am sharing with you, my readers,
>information that I have created?
  S: Agreed. Information in a growing system continues to increase.  No
limit is known (although the pattern is likely mostly asymptotic) -- unless
there could be some limit to the amount of uncertainty that a system can
bear.

>Shannon entropy quantifies the information contained in a piece of data:
>it is the minimum average message length, in bits. Shannon information as
>the minimum number of bits needed to represent it is similar to the
>formulations of Chaitin information or Kolmogorov information. Shannon
>information has functionality for engineering purposes but since this is
>information without meaning it is better described as the measure of the
>amount and variety of the signal that is transmitted and not described as
>information. Shannon information theory is really signal transmission
>theory. Signal transmission is a necessary but not a sufficient condition
>for communications. There is no way to formulate the semantics, syntax or
>pragmatics of language within the Shannon framework.
 S: What is missing is semiotics!

And here I reply to Karl
>Commenting upon Pedro's and Stan's:
>
>>Thus, I come back to meaning, helas, to do the same than the theoretical
>>physicist, but in the province of biology. The following 10 points could be
>>defended:
>>
>>1. Meaning is built molecularly, by the living cell.
> S: This is the position of the biosemiotics community (Semiotics -
>the study of meaning construction). With a nod to Loet, the procedure is
>to begin with the most highly developed example of semiosis that we know of
>-- human discourse -- to derive the necessary categories (induction, etc.),
>which are then generalized in the spirit of systems science, so as to apply
>them to biosemiosis, and all the way to pansemiosis if we like.
>
>   K: We 

Re: [Fis] more thoughts about Shannon info

2007-11-07 Thread karl javorszky
The model proposed for numeric treatment of information answers following
points raised by Shannon and Logan:
Logan:

>  The inspiration for adopting the word entropy in information theory comes
> from the close resemblance between Shannon's formula and the very similar
> formula from thermodynamics: S = -k ∑ pi ln pi .   Entropy is related to
> the second law of thermodynamics that states that:
>
> Energy spontaneously tends to flow only from being concentrated in one
> place to becoming diffused or dispersed and spread out. And energy is
> governed by the first law of thermodynamics that states that energy cannot
> be destroyed or created.
>
> Is there an equivalent 1st and  2nd law for information?
>

Yes, there is. Information being the relation between the number of symbols
and their kind (properties; in the numeric sense: extent, in the logical
sense: kind), one can propose the following observation to be generalised
into a rule:
A closed system of symbols can be transformed into a differing closed system
of symbols while maintaining an identical informational content.
This means that the relation between the number of symbols and their kind
cannot be destroyed or created.
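
For the one-to-one case, at least, the rule is easy to illustrate in Shannon
terms (Python; the symbol sets are invented): a bijective recoding of a closed
system of symbols leaves its entropy unchanged.

    from math import log2

    def H(dist):
        """Shannon entropy in bits of a dict {symbol: probability}."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # A closed system of symbols with a given distribution (invented numbers).
    system_1 = {"alpha": 0.5, "beta": 0.3, "gamma": 0.2}

    # A one-to-one transformation into a differing closed system of symbols.
    recode = {"alpha": "X", "beta": "Y", "gamma": "Z"}
    system_2 = {recode[s]: p for s, p in system_1.items()}

    print(H(system_1), H(system_2))   # identical: H depends only on the probabilities

This covers only the bijective case; the rule as stated above is broader.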

 Entropy is used to describe systems that are undergoing dynamic
> interactions like the molecules in a gas. What is the analogy with Shannon
> entropy or information?
>

In both cases, this is a LOCAL phenomenon taking place in a globally closed
system. While a part of the system cools down to a uniform level, a
different part of the system heats up (explodes, fuses, contracts, etc.). In
the numeric model, if the overall constant of information content of an
assembly remains the same (as it definitely does, assuming a finite *n*),
there may well be subsegments in the logical space which are more uniform
than other subsegments.
(Example: all true logical sentences that detail the relation between parts
and the whole, with the whole being < 137, form the closed universe. This set
has a given, constant, overall information content. It may however very well
be the case that one specific subset has a deviating extent - locally - of its
own - local - information content.
Numeric explanation:
It may be that 66 is, with a probability of 90%, in 10..18 parts, but it may
as well be that one of the cases describes a freakish assembly of far too many
1s as opposed to bigger summands. Any of the summands can be outside its
most probable range, and it is a certainty that one will observe a local
phenomenon of summands dissolving into elementary units.)
This process happens in actual Nature surrounding us sufficiently often and
is sufficiently unusual so that humans notice and remember it and give it a
name. It appears that this process carries the name of "entropy".
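
The claim about 66 can be checked numerically (Python sketch; the 90% figure
is not verified here, the code merely computes the distribution of part-counts
over all partitions of 66):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def parts(n, k):
        """Number of partitions of n into exactly k positive parts."""
        if n == 0 and k == 0:
            return 1
        if n <= 0 or k <= 0:
            return 0
        # Either some part equals 1 (remove it), or every part is >= 2
        # (subtract 1 from each of the k parts).
        return parts(n - 1, k - 1) + parts(n - k, k)

    n = 66
    total = sum(parts(n, k) for k in range(1, n + 1))
    in_range = sum(parts(n, k) for k in range(10, 19))   # 10..18 parts
    print(total, "partitions of", n)
    print("fraction with 10..18 parts:", round(in_range / total, 3))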

 Is Shannon's formula really the basis for a theory of information or is it
> merely a theory of signal transmission?
>

No, Shannon's formula is not the basis of anything in information theory.
Shannon's formula is the roof of an edifice based on the logic of
similarities. Information theory deals with a different basic concept. In
information theory the kind of a symbol is also a logical category of its
own and is not derived from the number of symbols. (In classical -
similarity-based - logic, the kind of a symbol derives from its number, i.e.
the number of elementary units that make up this whole.) Information theory
negates the assumption that the parts actually and absolutely fuse into a
whole, an assumption which is depicted in the procedures currently in
exclusive use relating to the operation of addition.

 Thermodynamic entropy involves temperature and energy in the form of
> heat, which is constantly spreading out. Entropy S  is defined as ∆Q/T. What
> are the analogies for Shannon entropy?
>
> There is the flow of energy in thermodynamic entropy but energy is
> conserved, i.e. it cannot be destroyed or created.
>
> There is the flow of information in Shannon entropy but is information
> something that cannot be destroyed or created as is the case with energy? Is
> it conserved? I do not think so because when I share my information with you
> I do not lose information but you gain it and hence information is created.
> Are not these thoughts that I am sharing with you, my readers, information
> that I have created?
>
Globally:
In the case that humans DISCOVER the a priori existing laws of Nature, your
communication does not transmit anything new, because the connections have
always been there, only we did not notice them before.
In the case that humans CREATE mental images depicting a Nature that is - for
axiomatic reasons - not intelligible to humans, your communication says: "Is
it new for you that I can make myself understood?" and is a communication
for grammatical reasons, with no content.
 Locally:
Your communication reorders the concepts within the brain of the reader and
presumably changes some relations between the number of symbols and kinds of
symbols that were and are 

[Fis] more thoughts about Shannon info

2007-11-06 Thread bob logan
Dear colleagues - please forgive my lapse in communications. I have  
been studying the question of Shannon info and have come up with the  
following thoughts I want to share with you. As always comments are  
solicited. Rather than answering each point raised in our recent  
email exchanges I decided to do some research and try to answer a
number of inquiries all at once.


The inspiration for adopting the word entropy in information theory  
comes from the close resemblance between Shannon's formula and the  
very similar  formula from thermodynamics: S = -k ∑ pi ln pi .
Entropy is related to the second law of thermodynamics that states that:


Energy spontaneously tends to flow only from being concentrated in  
one place
to becoming diffused or dispersed and spread out. And energy is  
governed by the first law of thermodynamics that states that energy  
cannot be destroyed or created.


Is there an equivalent 1st and  2nd law for information?

Entropy is used to describe systems that are undergoing dynamic  
interactions like the molecules in a gas. What is the analogy with  
Shannon entropy or information?


Is Shannon’s formula really the basis for a theory of information or  
is it merely a theory of signal transmission?


Thermodynamic entropy involves temperature and energy in the form of  
heat, which is constantly spreading out. Entropy S  is defined as ∆Q/ 
T. What are the analogies for Shannon entropy?


There is the flow of energy in thermodynamic entropy but energy is  
conserved, i.e. it cannot be destroyed or created.


There is the flow of information in Shannon entropy but is  
information something that cannot be destroyed or created as is the  
case with energy? Is it conserved? I do not think so because when I  
share my information with you I do not lose information but you gain  
it and hence information is created. Are not these thoughts that I am  
sharing with you, my readers, information that I have created?


Shannon entropy quantifies the information contained in a piece of  
data: it is the minimum average message length, in bits. Shannon  
information as the minimum number of bits needed to represent it is  
similar to the formulations of Chaitin information or Kolmogorov
information. Shannon information has functionality for engineering  
purposes but since this is information without meaning it is better  
described as the measure of the amount and variety of the signal that  
is transmitted and not described as information. Shannon information  
theory is really signal transmission theory. Signal transmission is a  
necessary but not a sufficient condition for communications. There is  
no way to formulate the semantics, syntax or pragmatics of language  
within the Shannon framework.