Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread tozziarturo

Dear Sung, 
I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds true.
Forget philosophical concepts like Yin and Yang, because in some cases and contexts entropy is negative.
To give just one example:
"Since the entropy H(S|O) can now become negative, erasing a system can result 
in a net gain of work (and a corresponding cooling of the environment)."
https://www.nature.com/nature/journal/v474/n7349/full/nature10123.html
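As a rough numerical illustration of that statement (a sketch only, assuming the
paper's work-cost relation W = H(S|O)*k_B*T*ln(2) and a temperature of 300 K):

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed temperature, K

for H_cond in (1.0, 0.0, -1.0):                  # conditional entropy H(S|O), in bits
    work = H_cond * k_B * T * math.log(2)        # work cost of erasure, J per bit
    print(f"H(S|O) = {H_cond:+.0f} bit -> work cost = {work:+.3e} J")

# A negative H(S|O) yields a negative work cost, i.e. a net gain of work of
# roughly 2.9e-21 J per bit at 300 K, as the quoted sentence says.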
--
Sent from Libero Mail for Android on Friday, 13 October 2017, 10:11 PM +02:00, in reply to 
Sungchul Ji  s...@pharmacy.rutgers.edu.


Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread Sungchul Ji
Hi Arturo,


(1)  I don't understand where you got (or how you can justify) S = 1 J/K in 
your statement,


"With the same probability mass function, you can see that H = S/(ln(2)*kB), so 
setting S = 1J/K gives a Shannon entropy of 1.045×1023 bits."


(2) I can see how one can get H = S/(ln(2)*k_B) mathematically, but what does 
this equality mean physically?

(3) This reminds me of what Schroedinger did when he came up with the 
conclusion that "negative entropy" is equivalent to "order", which led to 
Brillouin's so-called "Negentropy Principle of Information" (NPI) [1, 2].


Simply by multiplying both sides of the Boltzmann equation by minus one, 
Schroedinger obtained the following formula:


 -S = -k ln W = k ln(1/W)


and then equating W with disorder, D, led him to


-S = k ln(1/D).


Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
he concluded that


"negative entropy = order".


As you can see, the above derivation is mathematically sound but the result 
violates the Third Law of Thermodynamics, according to which thermodynamic 
entropy cannot be less than zero.


Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" 
as follows [3]:


"Schroedinger's paradox refers to the mathematical equations, concepts, or 
general statements that are formally true but physically meaningless."


(4) If my argument in (3) is valid, this may provide an example of what may be 
called


the "Unreasonable Ineffectiveness of Mathematics"


which, together with Wigner's "Unreasonable Effectiveness of Mathematics", may 
constitute a Yin-Yang pair of mathematics.


All the best.


Sung











References:
   [1]  Brillouin, L. (1953). Negentropy Principle of Information. J. Applied 
Phys. 24(9), 1152-1163.
   [2]  Brillouin, L. (1956). Science and Information Theory. Academic Press, 
Inc., New York, pp. 152-156.
   [3]  Ji, S. (2012). The Third Law of Thermodynamics and "Schroedinger's 
Paradox". In: Molecular Theory of the Living Cell: Concepts, Molecular 
Mechanisms, and Biomedical Applications. Springer, New York, pp. 12-15. PDF at 
http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf










Re: [Fis] Data - Reflection - Information

2017-10-13 Thread Robert E. Ulanowicz
Dear Mark,

Thank you for your interest in my FIS paper!


I didn't intend by it to imply that Shannon-class measures are the
ultimate tool for information science, only to argue against prematurely
rejecting that thrust entirely -- as so many do. By looking at Bayesian
forms of the Shannon measure we can address information per se (and even a
form of proto-meaning) and achieve a measure of what is missing. This
latter advantage opens up another dimension to science. (The apophatic had
been implicitly addressed by thermodynamic entropy, which has hardly ever
been recognized as an apophasis. That's why entropy remains so confusing
to so many!)

The Achilles' heel of Shannon-like measures lies in the underlying
assumption of distinct categories with which to describe the
distributions. The boundaries between categories are often "fuzzy" and,
as you point out, they change with time and growth.

I have been told that mutual information(s) has been defined over fuzzy
sets, but I confess I haven't investigated the advantages of this
extension. As for changing numbers of categories, I note that mutual
information remains well-defined even when the numbers of categories in
the sets being compared are not the same. So I would encourage your
exploration with musical forms.
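A minimal sketch of that point (an illustration only, in Python, using two toy
symbol strings with different numbers of categories):

from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Mutual information (in bits) between two aligned symbol sequences."""
    assert len(xs) == len(ys)
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

A = "ABABABACABABABAC"           # 3 categories
B = "XYXYXYXZXYXYXYXW"           # 4 categories
print(mutual_information(A, B))  # well-defined despite the unequal alphabets

Applied to successive segments of a growing pair of strings, the same kind of
calculation would give the time-resolved comparisons discussed below.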

As to Ashby's metaphor of a homeostat as a machine, my personal preference
is to restrict mechanical analogs for living systems to only those that
are unavoidable. I feel the language of mechanics and mechanisms is
*vastly* overused in biology and draws our attention away from the true
nature of biotic systems.

Thank you for your challenging and astute questions!

Cheers,
Bob

> Dear Bob,
>
> In your Shannon Exonerata paper you have an example of three strings,
> their entropies and their mutual information. I very much admire this
> paper, and particularly the critique of Shannon and the emphasis on the
> apophatic, but some things puzzle me. If these are strings of a living
> thing, then we can assume that these strings grow over time. If sequences
> A, B and C are related, then the growth of one is dependent on the growth
> of the others. This process occurs in time. During the growth of the
> strings, even the determination of what is and is not surprising changes
> with the distinction between what is seen to be the same and what isn't.
>
>  I have begun to think that it's the relative entropy between growing
> things (whether biological measurements, lines of musical counterpoint,
> learning) that matters. Particularly as mutual information is a variety
> of relative entropy. There are dynamics in the interactions. A change in
> entropy for one string with no change in entropy in the others (melody
> and accompaniment) is distinct from everything changing at the same time
> (that's "death and transfiguration"!).
>
> Shannon's formula isn't good at measuring change in entropy. It's even less
> good with changes in distinctions which occur at critical moments ("Aha! A
> discovery!" or "This is no longer surprising"). The best that we might do,
> I've thought, is to segment your strings over time and examine relative
> entropies. I've done this with music. Does anyone have any other
> techniques?
>
> On the apophatic, I can imagine a study of the dynamics of Ashby's
> homeostat where each unit produced one of your strings. The machine comes
> to its solution when the entropies of the dials are each 0 (redundancy 1).
> As the machine approaches its equilibrium, the constraint of each dial on
> every other can be explored by the relative entropies between the dials.
> If we wanted the machine to keep on searching and not settle, it's
> conceivable that you might add more dials into the mechanism as its
> relative entropy started to approach 0. What would this do? It would
> maintain a counterpoint in the relative entropies within the ensemble.
> Would adding the dial increase the apophasis? Or the entropy? Or the
> relative entropy?
>
> Best wishes,
>
> Mark


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] R: Re: A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread tozziart...@libero.it
Dear Sung,
One J/K corresponds to 1.045×10^23 bits.
Indeed, the Gibbs entropy formula states that thermodynamic entropy S equals 
k_B * sum_i[p_i * ln(1/p_i)], with units of J/K, where k_B is the Boltzmann 
constant and p_i is the probability of microstate i. On the other hand, the 
Shannon entropy is defined as H = sum_i[p_i * log2(1/p_i)], with units of bits. 
With the same probability mass function, you can see that H = S/(ln(2)*k_B), so 
setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits.
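A minimal numerical check of that conversion, sketched in Python (the
probability mass function below is just an arbitrary example):

import math

k_B = 1.380649e-23                    # Boltzmann constant, J/K
pmf = [0.5, 0.25, 0.125, 0.125]       # arbitrary example distribution

S = k_B * sum(p * math.log(1.0 / p) for p in pmf)   # Gibbs entropy, J/K
H = sum(p * math.log2(1.0 / p) for p in pmf)        # Shannon entropy, bits

print(H, S / (math.log(2) * k_B))     # both 1.75 bits: H = S/(ln(2)*k_B)
print(1.0 / (math.log(2) * k_B))      # ~1.045e23 bits per J/K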
On the other hand, the energy consumption per bit of data on the Internet is 
around 75 μJ at low access rates and decreases to around 2-4 μJ at an access 
rate of 100 Mb/s. See:
http://www.ing.unitn.it/~fontana/GreenInternet/Recent%20Papers%20and%20p2p/Baliga_Ayre_Hinton_Sorin_Tucker_JLT0.pdf

Further, according to Landauer's theory, a minimum amount of heat (roughly 
10^-21 J per erased bit) must be dissipated when information is destroyed.
http://physicsworld.com/cws/article/news/2012/mar/12/wiping-data-will-cost-you-energy
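For scale, a back-of-the-envelope comparison (a sketch only, assuming room
temperature, T = 300 K, for the Landauer bound):

import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # assumed room temperature, K

landauer = k_B * T * math.log(2)   # minimum heat per erased bit, J
print(landauer)                    # ~2.87e-21 J, the "roughly 10^-21 J" figure

print(75e-6 / landauer)            # 75 μJ/bit is ~2.6e16 times the Landauer limit
print(3e-6 / landauer)             # ~3 μJ/bit is ~1.0e15 times the Landauer limit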


In other words: if you use free energy to assess information, it works just the 
same, giving a quantifiable value.

Arturo Tozzi
AA Professor Physics, University North Texas
Pediatrician ASL Na2Nord, Italy
Comput Intell Lab, University Manitoba
http://arturotozzi.webnode.it/





Original message

From: "Sungchul Ji" 

Date: 12/10/2017 22.08

To: "Francesco Rizzo"<13francesco.ri...@gmail.com>, "Pedro C. Marijuan"

Cc: "fis@listas.unizar.es >> fis@listas.unizar.es"

Subject: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION








Hi FISers,





The following statement cannot be true.

"a proposal: information might stand for free energy."  


For one thing, the unit of information is bits and that of energy is cal or 
erg.




The proper relation between information and energy (including free energy) may 
be complementarity, just as is the relation between wave and particle. 
According to the ITR (Irreducible Triadic Relation) model of signs and 
communication, information and energy are entangled in the sense that both are 
irreplaceably implicated in the process of communication. Both information and 
energy are needed for communication, the minimum energy cost of transmitting 
one bit of information being ~0.6 kcal/mole, according to Shannon.
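An order-of-magnitude check of that figure, sketched under stated assumptions
(per-bit minimum taken as k_B*T*ln(2), T = 300 K, scaled by Avogadro's number;
the thermal energy RT itself is about 0.6 kcal/mol at that temperature):

import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
N_A = 6.02214076e23       # Avogadro's number, 1/mol
T = 300.0                 # assumed temperature, K
CAL = 4.184               # J per calorie

R = k_B * N_A                                      # gas constant, ~8.314 J/(mol*K)
per_bit_molar = R * T * math.log(2) / (CAL * 1e3)  # k_B*T*ln2 per bit, kcal/mol
thermal_molar = R * T / (CAL * 1e3)                # thermal energy RT, kcal/mol

print(per_bit_molar)   # ~0.41 kcal/mol per bit
print(thermal_molar)   # ~0.60 kcal/mol, the RT scale at 300 K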



All the best.



Sung



 






From: Fis  on behalf of Francesco Rizzo 
<13francesco.ri...@gmail.com>

Sent: Thursday, October 12, 2017 3:00 AM

To: Pedro C. Marijuan

Cc: fis@listas.unizar.es >> fis@listas.unizar.es

Subject: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION
 


Dear Pedro and dear all,
The inputs and outputs that living cells exchange with the environment are 
nothing other than matter, energy and information entering (INPUT) and leaving 
(OUTPUT), giving rise to the process of TRAS-IN-FORM-AZIONE (trans-in-form-action) 
that I developed in the New Economics with regard to entropic production systems 
(degraded energy, or dis-information) and neg-entropic ones (free energy, or 
information), which are of a general character. So much so that about twenty 
years ago I applied and referred this to the cell, which establishes with its 
(biological-natural) environment a relationship similar to the one an enterprise 
(firm) establishes with its (social-economic) environment. After all, 
biochemistry and economics turn out to be complementary in human life, whose 
existence and knowledge can be well understood according to empirical or 
concrete onto-logic, otherwise known as LIR, which Joseph Brenner's generosity 
has also glimpsed in my scientific analysis. Unfortunately this problematic, 
well expressed and summarized by the process of TRAS-IN-FORM-AZIONE and several 
times the object of comparison and discussion in the FIS debate, is little 
known because it is set out in some twenty of my books written in Italian.
In any case, TIME is (always a gentleman and provides) the right INFORMATION, 
performing the function of the LANGUAGE of LANGUAGES that everyone can 
com-prehend, sooner or later. Thank you for the opportunity you give me, 
starting with Pedro, who has the great merit of the initiation-mediation in 
this regard.
A hug, Francesco Rizzo.






2017-10-11 14:30 GMT+02:00 Pedro C. Marijuan :



Dear Arturo and colleagues,



I think that relating information to free energy can be a good idea. I am not 
sure whether the expressions derived from Gibbs free energy (below) have 
sufficient generality; at least they work very well for chemical reactions. And 
it is in the biomolecular (chemical) realm where the big divide between 
"animate information" and "inanimate information" occurs. In that sense, I 
include herein the scheme we have just published of prokaryotic cells in their 
management of the "information flow". In a next message I will make suggestions 
on how the mapping of biological information may conduce to a more general 
approach that includes the other varieties of information (anthropocentric, 
physical, chemical, cosmological, et