Hi FISers,

I found a typo in the legend to Figure 1 in my last post: ". . . without
energy dissipation, no energy, . . ." should read

     "Without energy dissipation, no information."                      (4)


In fact, Statement (4) may be fundamental to informatics in general, so it
may be referred to as the "First Principle of Informatics" (FPI).


If this conjecture is correct, the FPI may apply to the controversial 
interpretations of the wavefunction of a material system (WFMS), since the 
WFMS is supposed to encode all the information we have about the material 
system under consideration and hence implicates "information".  It thus seems 
to me that a complete interpretation of a wavefunction, according to the FPI, 
must also specify the selection process, i.e., the free energy-dissipating 
step, which I am tempted to identify with "measurement", "quantum jump", or 
"wavefunction collapse".


I am not a quantum mechanician, so it is possible that I have committed some 
logical errors somewhere in my argument above.  If you detect any, please let 
me know.


With all the best.


Sung

________________________________
From: Fis <fis-boun...@listas.unizar.es> on behalf of Sungchul Ji 
<s...@pharmacy.rutgers.edu>
Sent: Sunday, June 3, 2018 12:13:11 AM
To: 'fis'
Subject: [Fis] The information-entropy relation clarified: The New Jerseyator


Hi FISers,


One simple (and maybe too simple) way to distinguish between information and 
entropy may be as follows:


(i) Define information (I) as in Eq. (1):


                  I = -log_2(m/n) = -log_2(m) + log_2(n)                 (1)


where n is the number of all possible choices (also called the variety) and 
m is the number of choices actually made or selected.
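
For concreteness, here is a minimal Python sketch of Eq. (1) (an illustration 
only; the function name and the example numbers are mine, not from any source):

    from math import log2

    def information(m, n):
        # Eq. (1): I = -log_2(m/n) = -log_2(m) + log_2(n)
        # n = number of all possible choices (the variety)
        # m = number of choices actually made or selected
        return -log2(m / n)

    # Example: selecting 1 message out of 8 equally possible ones
    print(information(1, 8))   # 3.0 bits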


(ii) Define the negative binary logarithm of n, i.e., -log_2(n), as a measure 
of the 'variety' of all possible choices and hence as identical with the 
Shannon entropy H, as suggested by Wicken [1].  Then Eq. (1) can be rewritten 
as Eq. (2):


                   I = -log_2(m) - H                                     (2)


(iii) It is evident that when m = 1 (i.e., when only one is chosen out of all 
the variety of choices available), Eq. (2) reduces to Eq. (3):


                    I = -H                                               (3)


(iv) As is well known, Eq. (3) is the basis for the so-called "negentropy 
principle of information" first advocated by Schrödinger and later by 
Brillouin and others.  But Eq. (3) is clearly not a principle but a special 
case of Eq. (2) with m = 1.
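
To make this reduction explicit, here is a short Python sketch (a sketch 
only, following the sign convention of step (ii) above, where H is taken as 
-log_2(n); the names are mine):

    from math import log2

    def H(n):
        # Step (ii) convention of this post: H = -log_2(n)
        return -log2(n)

    def information(m, n):
        # Eq. (2): I = -log_2(m) - H
        return -log2(m) - H(n)

    n = 8
    print(information(1, n))   # 3.0: with m = 1, Eq. (2) gives I = -H, i.e., Eq. (3)
    print(-H(n))               # 3.0: -H = log_2(8)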


(v) In conclusion, I claim that information and negative entropy are not the 
same, either qualitatively or quantitatively (except when m = 1 in Eq. (2)), 
and that they represent two opposite nodes of a fundamental triad [2]:


                                  Selection
      H  ---------------------------------------------------------->  I
 (uncertainty before selection)              (uncertainty after selection)






Figure 1.  The New Jerseyator model of information (NMI) [3].  Since selection 
requires free energy dissipation, NMI implicates both information and energy.  
That is, without energy dissipation, no energy, and hence NMI may be viewed as 
a self-organizing process (also called a dissipative structure) or an '-ator'.  
NMI is also consistent with the "uncertainty reduction model of information."



With all the best.


Sung


P.S.  There is experimental evidence that information and entropy are 
orthogonal, thus giving rise to the Planck-Shannon plane, which has been shown 
to distinguish between cancer and healthy cell mRNA levels.  I will discuss 
this in a later post.



References:

   [1] Wicken, J. S. (1987).  Entropy and information: suggestions for common 
language.  Phil. Sci. 54: 176-193.
   [2] Burgin, M. (2010).  Theory of Information: Fundamentality, Diversity, 
and Unification.  World Scientific Publishing, New Jersey.
   [3] Ji, S. (2018).  The Cell Language Theory: Connecting Mind and Matter.  
World Scientific Publishing, New Jersey.  Figure 10.24.
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
