Re: [Fis] PLANCKIAN INFORMATION: A NEW MEASURE OF ORDER (From S. Ji)

2017-03-23 Thread Sungchul Ji
Hi Pedro,


Thanks for the excellent job done.


Sung



From: Fis <fis-boun...@listas.unizar.es> on behalf of Pedro C. Marijuan 
<pcmarijuan.i...@aragon.es>
Sent: Thursday, March 23, 2017 6:25 AM
To: 'fis'
Subject: [Fis] PLANCKIAN INFORMATION: A NEW MEASURE OF ORDER (From S. Ji)


[Fis] PLANCKIAN INFORMATION: A NEW MEASURE OF ORDER (From S. Ji)

2017-03-23 Thread Pedro C. Marijuan

Note: what follows is an abbreviated text taken from the presentation.
The whole file, too big for our list, can be found at fis web pages:
http://fis.sciforum.net/wp-content/uploads/sites/2/2014/11/Planckian_information.pdf
A very recent article developing similar ideas: 
http://www.mdpi.com/2078-2489/8/1/24

Greetings to all--Pedro
--- 




What is the Planckian information?

SUNGCHUL JI

Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
s...@pharmacy.rutgers.edu

The Planckian information (I_P) is defined as the information produced 
(or used) by so-called Planckian processes, which are in turn defined 
as any physicochemical or formal processes that generate long-tailed 
histograms fitting the Planckian Distribution Equation (PDE),


    y = (A/(x + B)^5)/(exp(C/(x + B)) – 1)          (1)

where A, B and C are free parameters, x is the class or the bin to 
which objects or entities belong, and y is the frequency [1, 1a]. The PDE 
was derived in 2008 [2] from the blackbody radiation equation discovered 
by M. Planck (1858-1947) in 1900, by replacing the universal constants 
and temperature with free parameters A, B and C. The PDE has been found to 
fit not only blackbody radiation spectra (as it should) but also 
numerous other long-tailed histograms [3, 4] (see Figure 1).
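
As a minimal illustrative sketch (not part of the original presentation), Eq. (1) 
can be fitted to a long-tailed histogram with standard least-squares tools. The 
function name pde, the synthetic data, and the initial guesses below are 
assumptions chosen only for illustration:

    # Illustrative sketch only: fitting the PDE of Eq. (1) to a long-tailed histogram.
    # The function name `pde`, the synthetic data, and the initial guesses are
    # assumptions for illustration, not taken from the original post.
    import numpy as np
    from scipy.optimize import curve_fit

    def pde(x, A, B, C):
        # Eq. (1): y = (A/(x + B)^5) / (exp(C/(x + B)) - 1)
        return (A / (x + B) ** 5) / (np.exp(C / (x + B)) - 1.0)

    # Bin indices (x) and frequencies (y) of a hypothetical long-tailed histogram.
    x = np.arange(1, 21, dtype=float)
    y = pde(x, 5.0e5, 2.0, 10.0)
    y += np.random.default_rng(0).normal(0.0, 0.02 * y.max(), y.size)  # noise

    # Least-squares fit of the three free parameters A, B and C.
    (A_fit, B_fit, C_fit), _ = curve_fit(pde, x, y, p0=[1.0e5, 1.0, 5.0], maxfev=20000)
    print(A_fit, B_fit, C_fit)

Whether a given histogram is "Planckian" is then a question of how well such a 
fit reproduces the observed frequencies.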


One possible explanation for the universality of PDE is that many 
long-tailed histograms are generated by some selection mechanisms acting 
on randomly/thermally accessible processes [3]. Since random processes 
obey the Gaussian distribution, the ratio of the area under the curve 
(AUC) of PDE to that of Gaussian-like symmetric curves can be used as a 
measure of non-randomness or the order generated by the Planckian processes.


As can be seen in Figs. 1 (g), (i), (k), (o), (r) and (t), the curves 
labeled ‘Gaussian’ or ‘Gaussian-like’ overlap with the rising phase of 
the PDE curves. The ‘Gaussian-like’ curves were generated by Eq. (2), 
which was derived from the Gaussian equation by replacing its 
pre-exponential factor with the free parameter A:


    y = A e^(–(x – μ)^2/(2σ^2))          (2)
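
A similarly hedged sketch of Eq. (2); the function name gle and the example 
parameter values are assumptions, and the last line simply illustrates the 
symmetry of the GLE under sign reversal of (x – µ) discussed below:

    # Illustrative sketch of the Gaussian-Like Equation (GLE), Eq. (2).
    # The function name `gle` and the parameter values are assumptions.
    import numpy as np

    def gle(x, A, mu, sigma):
        # Eq. (2): y = A * exp(-(x - mu)^2 / (2 * sigma^2))
        return A * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

    # The GLE is symmetric under sign reversal of (x - mu):
    print(np.isclose(gle(8.0, 1.0, 5.0, 2.0), gle(2.0, 1.0, 5.0, 2.0)))  # True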

The degree of mismatch between the area under the curve (AUC) of the PDE, 
Eq. (1), and that of the GLE, Eq. (2), is postulated to be a measure of 
non-randomness (and hence order). The GLE is associated with random 
processes, since it is symmetric with respect to the sign reversal of 
its exponential argument, (x – µ). This measure of order is referred to as 
the Planckian Information (I_P), defined quantitatively as shown in Eq. 
(3) or Eq. (4):


    I_P = log_2 (AUC(PDE)/AUC(GLE))   bits          (3)

or


    I_P = log_2 [∫P(x)dx / ∫G(x)dx]   bits          (4)

where P(x) and G(x) are the Planckian Distribution Equation and the 
Gaussian-Like Equation, respectively.
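
For concreteness, a minimal sketch of how Eq. (3)/(4) might be evaluated 
numerically; the fitted parameter values and the integration limits below are 
assumptions for illustration only:

    # Illustrative sketch of Eq. (3)/(4): Planckian information as the base-2 log
    # of the ratio of the areas under the fitted PDE and GLE curves.  The fitted
    # parameter values and integration limits are assumptions for illustration.
    import numpy as np
    from scipy.integrate import quad

    def pde(x, A, B, C):                      # Eq. (1)
        return (A / (x + B) ** 5) / (np.exp(C / (x + B)) - 1.0)

    def gle(x, A, mu, sigma):                 # Eq. (2)
        return A * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

    pde_params = (5.0e5, 2.0, 10.0)           # A, B, C from a hypothetical PDE fit
    gle_params = (80.0, 1.5, 1.0)             # A, mu, sigma from a hypothetical GLE fit
    x_min, x_max = 0.0, 50.0                  # range covered by the histogram

    auc_pde, _ = quad(pde, x_min, x_max, args=pde_params)
    auc_gle, _ = quad(gle, x_min, x_max, args=gle_params)

    I_P = np.log2(auc_pde / auc_gle)          # Eq. (3)/(4), in bits
    print(f"AUC(PDE) = {auc_pde:.3g}, AUC(GLE) = {auc_gle:.3g}, I_P = {I_P:.3f} bits")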


It is generally accepted that there are at least three basic aspects to 
information – amount, meaning, and value. Planckian information 
is primarily concerned with the amount (and hence the quantitative 
aspect) of information. Numerous ways of quantifying information have been 
suggested in the literature besides the well-known 
Hartley information, Shannon entropy, algorithmic information, etc. 
[5]. The Planckian information, given by Equation (3), is a new measure 
of information that applies to the Planckian process, generally defined 
as in (5):


“Planckian processes are the physicochemical, neurophysiological,
biomedical, mental, linguistic, socioeconomic, cosmological, or any
other processes that generate long-tailed histograms obeying the
Planckian distribution equation (PDE).”          (5)

The Planckian information represents the degree of organization of 
physical (or nonphysical) systems, in contrast to the Boltzmann or the 
Boltzmann-Gibbs entropy, which represents the disorder/disorganization of 
a physical system, whether the system involved is atoms, enzymes, cells, 
brains, human societies, or the Universe. I_P is related to the 
“organized complexity” and S to the “disorganized complexity” of 
Weaver [6]. The organization represented by I_P results from 
symmetry-breaking selection processes applied to some randomly 
accessible (and hence symmetrically distributed) processes, whether the 
system involved is atoms, enzymes, cells, brains, languages, human 
societies, or the Universe [3, 4], as schematically depicted in Figure 2.


There is great confusion in science and philosophy concerning the 
relation between the concepts of information and entropy, as pointed 
out by Wicken [7]. A large part of this confusion can be traced back to 
the suggestions made by Schrödinger in 1944 [8] and by others subsequently 
(e.g., von Neumann, Brillouin) that order can be measured as the 
inverse of disorder (D), and hence that information can be measured 
as negative entropy (see the second column in Table 1).