Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-14 Thread Bob Logan
Hello Sung and Arturo. Entropy is a measure of disorder, and ΔS ≥ 0. If entropy 
is zero at T = 0 K because there is no disorder at absolute zero, then entropy 
can only increase from T = 0 K. If that is the case, how can entropy ever be 
negative?
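
One way to spell out this argument (a sketch of my own, for a simple system 
heated reversibly from absolute zero, with heat capacity C(T) >= 0 and 
S(0) = 0 by the Third Law):

    S(T) = S(0) + ∫_0^T [C(T')/T'] dT'  >=  0

so entropy measured from the Third-Law reference point can never drop below 
zero.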

Arturo asked me to share a private email I sent to him about the elephant and 
the three blind men; he urged me to share it with the group. Because we are 
limited to two posts per week, I waited until I had something else to post. So 
here, at Arturo's request, is our correspondence:

Dear Bob, 
A nice story!
I think you must share it with FISers.

...even if I think that, while one blind man is touching the elephant, another 
is touching a lion, and another a deer...
In other words, your tale says that a single elephant does exist, while I'm not 
so sure...
However, don't worry, I will not make a public critique of your nice tale!

--
Sent from Libero Mail for Android

Friday, 6 October 2017, 02:15 PM +02:00, from Bob Logan lo...@physics.utoronto.ca:

Dear Arturo - and thanks for your feedback. 

The discussion of info on the FIS list (re what is info) is like the three 
blind men inspecting an elephant. "It is a rope," said the blind man holding 
the tail; "No, it is a snake," said the blind man holding the trunk; "No, it is 
a tree," said the blind man touching the elephant's leg.

I refrained from using this story on the FIS list as it might come off as 
insulting - what do you think, should I share it with the group?

best wishes - Bob 


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan


Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-14 Thread Sungchul Ji
Hi Arturo,


I agree.  Entropy can be negative MATHEMATICALLY, as Schroedinger assumed.

But what I am claiming is that this may be a mathematical artifact, since, 
according to the Third Law of Thermodynamics, there is no negative entropy.


All the best.


Sung




Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread tozziarturo

Dear Sung, 
I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds 
true.  
Forget philosophical concepts like Yin and Yang, because, in some cases and 
contexts, entropy is negative.  
To give just one example:
"Since the entropy H(S|O) can now become negative, erasing a system can result 
in a net gain of work (and a corresponding cooling of the environment)."
https://www.nature.com/nature/journal/v474/n7349/full/nature10123.html
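
For concreteness, here is a minimal numerical sketch (my own illustration, not 
taken from the Nature paper; the names and the Python/NumPy choice are mine) of 
how the quantum conditional entropy H(S|O) = H(SO) - H(O) becomes negative for 
a maximally entangled system-observer pair:

    import numpy as np

    def von_neumann_entropy(rho):
        # von Neumann entropy in bits: -sum(p * log2(p)) over nonzero eigenvalues
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log2(p)))

    # Maximally entangled two-qubit state |phi+> = (|00> + |11>)/sqrt(2)
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    rho_SO = np.outer(phi, phi)                                      # joint density matrix of S and O
    rho_O = np.trace(rho_SO.reshape(2, 2, 2, 2), axis1=0, axis2=2)   # partial trace over S

    H_SO = von_neumann_entropy(rho_SO)     # 0 bits: the joint state is pure
    H_O = von_neumann_entropy(rho_O)       # 1 bit: the reduced state of O is maximally mixed
    print("H(S|O) =", H_SO - H_O, "bits")  # ≈ -1: the conditional entropy is negative

Note that it is the conditional entropy H(S|O), not the entropy of an isolated 
system, that goes negative here.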

Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread Sungchul Ji
Hi Arturo,


(1)  I don't understand where you got (or how you can justify) S = 1 J/K in 
your statement,


"With the same probability mass function, you can see that H = S/(ln(2)*kB), so 
setting S = 1J/K gives a Shannon entropy of 1.045×1023 bits."


(2) I can see how one can get H = S/(ln(2)*k_B) mathematically, but what does 
this equality mean physically?

(3) This reminds me of what Schroedinger did when he came up with the 
conclusion that "negative entropy" is equivalent to "order", which led to 
Brillouin's so-called "Negentropy Principle of Information" (NPI) [1, 2].


Simply by multiplying both sides of the Boltzmann equation by minus one, 
Schroedinger obtained the following formula:


 - S = - k ln W = k ln (1/W)


and then equating W with disorder, D, led him to


- S = k ln (1/D).


Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
he concluded that


"negative entropy = order".


As you can see, the above derivation is mathematically sound but the result 
violates the Third Law of Thermodynamics, according to which thermodynamic 
entropy cannot be less than zero.
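
A small numerical sketch of this point (my own illustration; in the Boltzmann 
form S = k_B ln W, the microstate count W is an integer >= 1, so S can never be 
negative, whatever sign manipulations one performs afterwards):

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K

    # S = k_B * ln(W); W counts microstates, so W >= 1 and S >= 0
    for W in (1, 2, 10, 10**23):
        print(f"W = {W}: S = {k_B * math.log(W):.3e} J/K")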


Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" 
as follows [3]


"Schroedinger's paradox refers to the mathematical equations, concepts, or 
general statements that are formally true but physically meaningless."


(4) If my argument in (3) is valid, this may provide an example of what may be 
called


the "Unreasonable Ineffectiveness of Mathematics"


which, together with Wigner's "Unreasonable Effectiveness of Mathematics", may 
constitute a Yin-Yang pair of mathematics.


All the best.


Sung











References:
   [1] Brillouin, L. (1953). Negentropy Principle of Information. J. Applied 
Phys. 24(9), 1152-1163.
   [2] Brillouin, L. (1956). Science and Information Theory. Academic Press, 
Inc., New York, pp. 152-156.
   [3] Ji, S. (2012). The Third Law of Thermodynamics and "Schroedinger's 
Paradox". In: Molecular Theory of the Living Cell: Concepts, Molecular 
Mechanisms, and Biomedical Applications. Springer, New York, pp. 12-15. PDF at 
http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf









From: tozziart...@libero.it 
Sent: Friday, October 13, 2017 4:43 AM
To: Sungchul Ji; fis@listas.unizar.es
Subject: R: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

Dear Sung,
One J/K corresponds to 1.045×10^23 bits.

Indeed,
The Gibbs entropy formula states that thermodynamic entropy S equals 
k_B*sum[p_i*ln(1/p_i)], with units of J/K, where k_B is the Boltzmann constant 
and p_i is the probability of microstate i. On the other hand, the Shannon 
entropy is defined as H = sum[p_i*log2(1/p_i)], with units of bits. With the 
same probability mass function, you can see that H = S/(ln(2)*k_B), so setting 
S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits.
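
For what it is worth, the arithmetic behind that figure can be checked in a few 
lines of Python (a sketch; the variable names are mine, and k_B is the exact SI 
value):

    import math

    k_B = 1.380649e-23            # Boltzmann constant, J/K
    S = 1.0                       # thermodynamic entropy, J/K

    H = S / (k_B * math.log(2))   # same pmf  =>  H = S / (k_B ln 2), in bits
    print(f"H = {H:.4e} bits")    # ~1.045e+23 bits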

On the other hand, the energy consumption per bit of data on the Internet is 
around 75 μJ at low access rates and decreases to around 2-4 μJ at an access 
rate of 100 Mb/s.
see:
http://www.ing.unitn.it/~fontana/GreenInternet/Recent%20Papers%20and%20p2p/Baliga_Ayre_Hinton_Sorin_Tucker_JLT0.pdf

Further, according to Landauer's theory, a minimum amount of heat – roughly 
10^-21 J per erased bit – must be dissipated when information is destroyed.
http://physicsworld.com/cws/article/news/2012/mar/12/wiping-data-will-cost-you-energy
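
A quick numerical check of both figures (a sketch of my own; T = 300 K is an 
assumed room temperature, and the comparison with the 75 μJ access-network 
figure above is mine):

    import math

    k_B = 1.380649e-23                  # Boltzmann constant, J/K
    T = 300.0                           # assumed room temperature, K
    landauer = k_B * T * math.log(2)    # minimum heat per erased bit
    print(f"Landauer bound: {landauer:.2e} J/bit")   # ~2.87e-21 J, i.e. roughly 10^-21 J

    internet = 75e-6                    # ~75 microjoules per bit at low access rates
    print(f"ratio: {internet / landauer:.1e}")       # ~2.6e+16 times the Landauer bound

so practical transmission costs sit many orders of magnitude above the Landauer 
minimum.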


In other words, if you use free energy to assess information, it works just as 
well, giving a quantifiable value.



Arturo Tozzi
AA Professor Physics, University North Texas
Pediatrician ASL Na2Nord, Italy
Comput Intell Lab, University Manitoba
http://arturotozzi.webnode.it/



Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-12 Thread Sungchul Ji
Hi FISers,


The following statement cannot be true.

"a proposal: information might stand for free energy."
For one thing, the unit of information is the bit and that of energy is the 
calorie or erg.

The proper relation between information and energy (including free energy) may 
be complementarity, just as is the relation between wave and particle. 
According to the ITR (Irreducible Triadic Relation) model of signs and 
communication, information and energy are entangled in the sense that both are 
irreplaceably implicated in the process of communication. Both information and 
energy are needed for communication, the minimum energy cost of transmitting 
one bit of information being ~0.6 kcal/mole, according to Shannon.
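
As a rough order-of-magnitude check (my own sketch, assuming room temperature 
and the k_B T ln 2 minimum per bit, i.e. RT ln 2 per mole of bits):

    import math

    R = 8.314                 # gas constant, J/(mol*K)
    T = 298.0                 # assumed room temperature, K

    E = R * T * math.log(2)   # minimum energy per mole of bits, J/mol
    print(f"{E:.0f} J/mol = {E / 4184:.2f} kcal/mol")   # ~1717 J/mol ≈ 0.41 kcal/mol

which is ~0.4 kcal per mole of bits, the same order of magnitude as the 
~0.6 kcal/mole figure quoted above.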

All the best.

Sung






Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-12 Thread Francesco Rizzo
Dear Pedro and dear all,
the inputs and outputs that living cells exchange with their environment are
nothing other than matter, energy and information entering (INPUT) and leaving
(OUTPUT), giving rise to the process of TRAS-IN-FORM-AZIONE
(trans-in-form-ation) that I developed in the New Economy with regard to
entropic production systems (degraded energy, or dis-information) and
neg-entropic ones (free energy, or information), which have a general
character. So much so that about 20 years ago I applied and referred this to
the cell, which establishes with its (biological-natural) environment a
relationship similar to the one an enterprise (firm) establishes with its
(social-economic) environment. After all, biochemistry and economics turn out
to be complementary in human life, whose existence and knowledge can be well
understood according to empirical or concrete onto-logic, otherwise known as
LIR, which Joseph Brenner's generosity has glimpsed in my scientific analysis
as well. Unfortunately this problematic, well expressed and summarized by the
process of TRAS-IN-FORM-AZIONE and repeatedly the subject of comparison and
discussion in the FIS debate, is little known because it is set out in some
twenty of my books written in Italian.
In any case, TIME is (always a gentleman, and supplies) the right INFORMATION,
performing the function of the LANGUAGE of LANGUAGES that everyone can
com-prehend, sooner or later. Thank you for the opportunity you give me,
starting with Pedro, who has the great merit of initiation-mediation in this
regard.
A hug, Francesco Rizzo.


2017-10-11 14:30 GMT+02:00 Pedro C. Marijuan :

> Dear Arturo and colleagues,
>
> I think that relating information to free energy can be a good idea. I am
> not sure whether the expressions derived from Gibbs free energy (below)
> have sufficient generality; at least they work very well for chemical
> reactions. And it is in the biomolecular (chemical) realm where the big
> divide between "animate information" and "inanimate information" occurs. In
> that sense, I include herein the scheme we have just published of
> prokaryotic cells in their management of the "information flow". In a next
> message I will make suggestions on how the mapping of biological
> information may conduce to a more general approach that includes the other
> varieties of information (anthropocentric, physical, chemical,
> cosmological, etc). Biological information is the most fundamental and
> radical track to unite the different approaches!
>
> Best--Pedro
>
> Pedro C. Marijuán, Jorge Navarro, Raquel del Moral.
> *How prokaryotes ‘encode’ their environment: Systemic tools for organizing
> the information flow.*
> Biosystems.
> October  2017. https://doi.org/10.1016/j.biosystems.2017.10.002
>
> *Abstract*
> An important issue related to code biology concerns the cell’s
> informational relationships with the environment. As an open self-producing
> system, a great variety of inputs and outputs are necessary for the living
> cell, not only consisting of matter and energy but also involving
> information flows. The analysis here of the simplest cells will involve two
> basic aspects. On the one side, the molecular apparatuses of the
> prokaryotic signaling system, with all its variety of environmental signals
> and component pathways (which have been called 1–2-3 Component Systems),
> including the role of a few second messengers which have been pointed out
> in bacteria too. And in the other side, the gene transcription system as
> depending not only on signaling inputs but also on a diversity of factors.
> Amidst the continuum of energy, matter, and information flows, there seems
> to be evidence for signaling codes, mostly established around the
> arrangement of life-cycle stages, in large metabolic changes, or in the
> relationships with conspecifics (quorum sensing) and within microbial
> ecosystems. Additionally, and considering the complexity growth of
> signaling systems from prokaryotes to eukaryotes, four avenues or “roots”
> for the advancement of such complexity would come out. A comparative will
> be established in between the signaling strategies and organization of both
> kinds of cellular systems. Finally, a new characterization of
> “informational architectures” will be proposed in order to explain the
> coding spectrum of both prokaryotic and eukaryotic signaling systems. Among
> other evolutionary aspects, cellular strategies for the construction of
> novel functional codes via the intermixing of informational architectures
> could be related to the persistence of retro-elements with obvious viral
> ancestry.
> ---
>
>
> On 10/10/2017 at 11:14, tozziart...@libero.it wrote:
>
> Dear FISers,
> a proposal: information might stand for free energy.
>
> Indeed, we know that, for an engine:
> enthalpy = free energy + entropy x temperature.
>
> At a fixed temperature,
> enthalpy = free energy