Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-08 Thread Sungchul Ji
Hi Michel,


Thank you for your informative comments and helpful suggestions in your earlier
post (which I happened to delete by accident).  In any case, I have a copy of
the post, so I can answer the questions you raised therein.


(1)  I am defining the Planckian information, I_P, as the information required
to transform a symmetric Gaussian-like equation (GLE) into the Planckian
Distribution Equation (PDE).  The GLE is the Gaussian distribution with its
pre-exponential factor replaced by a free parameter A, i.e.,
y = A*exp(-(x - \mu)^2/(2\sigma^2)), and it was found to overlap with the PDE
in the rising phase.  So far we have two different ways of quantifying I_P:
(i) the Planckian information of the first kind, I_PF = log_2 [AUC(PDE)/AUC(GLE)],
where AUC is the area under the curve, and (ii) the Planckian information of the
second kind, I_PS = -log_2 [(\mu - mode)/\sigma], which applies to right-skewed
long-tailed histograms only.  To make it apply also to left-skewed long-tailed
histograms, it would be necessary to replace (\mu - mode) with its absolute
value, i.e., |\mu - mode|.
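
For concreteness, a minimal numerical sketch of the two definitions (illustrative
only, not the code used in [1]): I_PS is estimated from a sample with a
histogram-based mode, and I_PF assumes the fitted PDE and GLE are already
available as callables pde and gle evaluated on a common grid x, with the
trapezoidal rule standing in for the AUC.

import numpy as np

def planckian_info_second_kind(sample, bins=50):
    # I_PS = -log_2(|mu - mode| / sigma); the absolute value also covers left-skewed histograms.
    mu, sigma = np.mean(sample), np.std(sample)
    counts, edges = np.histogram(sample, bins=bins)
    mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
    return -np.log2(abs(mu - mode) / sigma)

def planckian_info_first_kind(pde, gle, x):
    # I_PF = log_2(AUC(PDE)/AUC(GLE)), both areas taken by the trapezoidal rule.
    return np.log2(np.trapz(pde(x), x) / np.trapz(gle(x), x))

# Toy example: a right-skewed log-normal sample stands in for a long-tailed histogram.
sample = np.random.default_rng(0).lognormal(mean=0.0, sigma=0.6, size=10_000)
print("I_PS =", planckian_info_second_kind(sample))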


(2)  There can be more than two kinds of Planckian information, including what
may be called the Planckian information of the third kind, i.e.,
I_PT = -log_2 (\chi), as you suggest.  (By the way, how do you define \chi?)


(3)  The definition of Planckian information given in (1) implies that I_P is
associated with asymmetric distributions generated by distorting the symmetric
Gaussian-like distribution, i.e., by transforming the x-coordinate non-linearly
while keeping the y-coordinate of the Gaussian distribution invariant [1].




   Gaussian-like Distribution  ---- GP ---->  PDE  ---- definition ---->  I_P



Figure 1.  The definitions of the Gaussian process (GP) and the Planckian
information (I_P) based on the PDE (Planckian Distribution Equation).  GP is the
physicochemical process generating a long-tailed histogram that fits the PDE.
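
As a small illustration of the non-linear x-transformation mentioned in (3)
(illustrative only, not the construction in [1]): the y-values of a Gaussian-like
curve are kept fixed while the x-axis is warped with an arbitrary monotone map,
which turns the symmetric curve into a right-skewed, long-tailed one.

import numpy as np

x = np.linspace(-4.0, 4.0, 400)
y = np.exp(-x**2 / 2.0)            # symmetric Gaussian-like curve (A = 1)

x_warped = np.exp(0.8 * x)         # an arbitrary non-linear, monotone x-transformation
# (x_warped, y) traces a right-skewed, long-tailed curve: same y-values, distorted x-axis.
mode = x_warped[np.argmax(y)]
mean = np.trapz(x_warped * y, x_warped) / np.trapz(y, x_warped)
print(f"mode ~ {mode:.2f}, mean ~ {mean:.2f}  (mean > mode: right-skewed)")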




(4)  I am assuming that the PDE-fitting asymmetric histograms will always have
non-zero measures of asymmetry.

(5)  I have shown in [1] that the human decision-making process is an example
of a Planckian process: it can be derived from a Gaussian distribution based
on the drift-diffusion model well known in the field of decision-making
psychophysics.
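
A minimal simulation sketch of the drift-diffusion idea behind (5) (illustrative
only, not the derivation in [1]): first-passage times of a noisy drift process
to a single decision threshold form a right-skewed, long-tailed histogram of the
kind to which I_PS applies; the drift, noise, threshold, and time-step values
below are arbitrary.

import numpy as np

def first_passage_times(n_trials=5000, drift=1.0, noise=1.0, threshold=1.0,
                        dt=0.005, max_time=30.0, seed=0):
    # Euler-Maruyama simulation of dx = drift*dt + noise*dW until x crosses the threshold.
    rng = np.random.default_rng(seed)
    x = np.zeros(n_trials)
    t = np.full(n_trials, np.nan)              # decision time per trial (NaN = undecided)
    for step in range(1, int(max_time / dt) + 1):
        undecided = np.isnan(t)
        if not undecided.any():
            break
        x[undecided] += drift * dt + noise * np.sqrt(dt) * rng.standard_normal(undecided.sum())
        t[undecided & (x >= threshold)] = step * dt
    return t[~np.isnan(t)]

rt = first_passage_times()
mu, sigma = rt.mean(), rt.std()
counts, edges = np.histogram(rt, bins=60)
mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
print(f"mean {mu:.2f} > mode {mode:.2f}: right-skewed, long-tailed decision times")
print("I_PS =", -np.log2((mu - mode) / sigma))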

Reference:
   [1] Ji, S. (2018).  The Cell Language theory: Connecting Mind and Matter.  
World Scientific Publishing, New Jersey.   Figure 8.7, p. 357.

All the best.

Sung






Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Michel Petitjean
Dear Karl,
In my reply to Sung I was dealing with the asymmetry of probability
distributions.
Probability distributions are presented on the Wikipedia page:
https://en.wikipedia.org/wiki/Probability_distribution
Don't read all of this page; the beginning should suffice.
Then, the skewness is explained on another wiki page:
https://en.wikipedia.org/wiki/Skewness
Possibly the content of these two pages is unclear to you.
In order to avoid a lot of long and unnecessary explanations, you
may tell me what you already know about probability distributions and
what was unclear in my post; then I can explain more efficiently.
However, I will let Sung explain his own post :)
Best regards,
Michel.

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Michel Petitjean
Dear Karl,
Yes I can hear you.
About symmetry, I shall soon send you an explanatory email, privately,
because I do not want to bother the FISers with long explanations
(unless I am required to do so).
However, I confess that many posts that I receive from the FIS list
are very hard to read, and often I do not understand their deep
content :)
In fact, that should not be shocking: few people are able to read
texts from very diverse fields (as it occurs in the FIS forum), and I
am not one of them.
Even Sung's post was unclear to me, which is exactly why I
asked him questions, but only on the points that I may have a chance
of understanding (maybe).
Best regards,
Michel.


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Karl Javorszky
Dear Michel and Sung,



Your discussion is way above my head in its jargon and background
knowledge. Please bear with me while a non-mathematician tries to express
some observations regarding symmetry.



Two almost symmetrical spaces appear as Gestalts, expressed by numbers, if
one orders and reorders the expression a+b=c. One uses natural numbers –
in the range of 1..16 – to create a demo collection, which one then sorts
and re-sorts ad libitum / ad nauseam. The setup of the whole exercise does
not take longer than 1, max 2 hours. Then one can observe patterns.



The patterns here specifically referred to are two – almost – symmetrical
rectangular, orthogonal spaces. As these patterns are derived from simple
sorting operations on natural numbers, one can well argue that they
represent fundamental pictures.



The generating algorithm is 5 lines of code. Here it is.



#d=16

begin outer loop, i:1,d
  begin inner loop, j:i,d
    append new record
    write
      a=i, b=j, c=a+b, k=b-2a, u=b-a, t=2b-3a,
      q=a-2b, s=(d+1)-(a+b), w=2a-3b
  end inner loop
end outer loop
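
For readers who want to run it, a direct Python transcription of the pseudocode
above (illustrative; the nine column letters follow the listing, and with d = 16
the nested loops yield the 136 rows mentioned below):

d = 16
table = []                              # each record holds the nine aspects of a+b=c
for i in range(1, d + 1):               # outer loop: a = i
    for j in range(i, d + 1):           # inner loop: b = j, with b >= a
        a, b = i, j
        table.append({'a': a, 'b': b, 'c': a + b, 'k': b - 2*a, 'u': b - a,
                      't': 2*b - 3*a, 'q': a - 2*b, 's': (d + 1) - (a + b),
                      'w': 2*a - 3*b})
print(len(table))                       # 136 records for d = 16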







The next step is to sequence (sort, order) the rows. We use 2 sorting
criteria: as the first, any one of {a,b,c,k,u,t,q,s,w}, and as the 2nd
sorting criterion, any of the remaining 8. This makes each of the 9 aspects
of a+b=c once a first and once a second sorting key. We register the
linear sequential number of each element in a column for each of the 72
catalogued sorting orders.

Do you think the idea of symmetry is somehow connected to some very basic
truths of logic? Then maybe the small effort of creating a database with 136
rows and 9+72 columns is worth making.



The trick begins with the next step:

We go through the 72 sorting orders and re-sort from each of them into
each of the remaining 71. We register the sequential place of the element
in the order αβ while it is being resorted into order γδ. This gives each
element a value (a linear place, 1..136) “from” and a value “to”. The
element is given the attributes: Element: a,b; “Old Order”: αβ, from
place nr. i; “New Order”: γδ, to place nr. j. While doing this, one will
realise that reorganisations happen by means of cycles, and will add the
attributes: Cycle nr: k; Within-cycle step nr: l. This is simple
counting and using logical flags.



The cycles that we have now arrived at give a very useful skeleton for
any and all theories about order. You will find the two Euclid-type spaces
by filtering out those reorganisations that consist of 46 cycles, of which
45 have 3 elements in their corpus, where each of the 45 cycles has
Σa=18, Σb=33.
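
A runnable sketch of the procedure just described (illustrative, not the
original code): it rebuilds the 136-row table, catalogues the 72 double-key
sorting orders, decomposes each reorganisation into cycles, and applies the
46-cycle / Σa=18 / Σb=33 filter. The post does not say how ties between rows
with equal key pairs are resolved, so breaking ties by (a, b) is an added
assumption, and the sketch does not attempt to verify the claim about the two
Euclid-type spaces.

from itertools import permutations

d = 16
cols = ['a', 'b', 'c', 'k', 'u', 't', 'q', 's', 'w']
rows = []
for i in range(1, d + 1):
    for j in range(i, d + 1):
        a, b = i, j
        rows.append({'a': a, 'b': b, 'c': a + b, 'k': b - 2*a, 'u': b - a,
                     't': 2*b - 3*a, 'q': a - 2*b, 's': (d + 1) - (a + b),
                     'w': 2*a - 3*b})

def order(key1, key2):
    # Places 0..135 sorted by (key1, key2); ties broken by (a, b) -- an added assumption.
    return sorted(range(len(rows)),
                  key=lambda r: (rows[r][key1], rows[r][key2], rows[r]['a'], rows[r]['b']))

orders = {(k1, k2): order(k1, k2) for k1, k2 in permutations(cols, 2)}   # 72 sorting orders

def cycles(src, dst):
    # Cycles of the permutation taking each row's place in order src to its place in order dst.
    place_in_dst = {row: p for p, row in enumerate(orders[dst])}
    perm = [place_in_dst[row] for row in orders[src]]
    seen, out = set(), []
    for start in range(len(perm)):
        if start not in seen:
            cyc, p = [], start
            while p not in seen:
                seen.add(p)
                cyc.append(p)
                p = perm[p]
            out.append(cyc)
    return out

# Filter: 46 cycles, of which 45 are 3-cycles, each with sum(a) = 18 and sum(b) = 33.
matches = []
for src, dst in permutations(orders, 2):                 # 72 * 71 reorganisations
    cs = cycles(src, dst)
    three = [c for c in cs if len(c) == 3]
    if (len(cs) == 46 and len(three) == 45
            and all(sum(rows[orders[src][p]]['a'] for p in c) == 18 and
                    sum(rows[orders[src][p]]['b'] for p in c) == 33 for c in three)):
        matches.append((src, dst))
print(len(rows), len(orders), len(matches))              # 136 rows, 72 orders, surviving pairs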




The two rectangular spaces – created by paths of elements during resorting
– are not quite symmetrical. As an outsider, I'd believe that there is
something here to awaken the natural curiosity of mathematicians.



Hoping to have caught your interest.



Karl



Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Michel Petitjean
Dear Sung,

The formula of the Planckian information in Table 1 is intriguing.
The argument of the log_2 function was proposed in 1895 by Karl Pearson as
a measure of asymmetry of a distribution (see [1], p. 370).
In general the mean can be smaller than the mode (so the log cannot exist),
but I assume that in your context that cannot happen.
Also, I assume that this context excludes distributions such as a mixture
of two well separated unit variance Gaussian laws, for which the mean is
located at an antimode, and not at a mode.

The skewness, which is also used as an asymmetry coefficient, is the
reduced third-order centered moment (it may be positive or negative).
The square of this latter quantity was also introduced by Karl Pearson as a
measure of asymmetry of a distribution (see [1], p. 351).

So, all these quantities are used as asymmetry measures.
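
As a concrete illustration (not part of the post), the two Pearson measures
mentioned above, the mode-based coefficient (\mu - mode)/\sigma and the moment
skewness, can be estimated from a sample as follows; the histogram-based mode
estimate is an arbitrary choice.

import numpy as np

def asymmetry_measures(sample, bins=50):
    # Pearson mode skewness (mu - mode)/sigma and moment skewness m3/sigma^3.
    mu, sigma = np.mean(sample), np.std(sample)
    counts, edges = np.histogram(sample, bins=bins)
    mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
    return (mu - mode) / sigma, np.mean((sample - mu) ** 3) / sigma ** 3

rng = np.random.default_rng(1)
print("symmetric (normal):       ", asymmetry_measures(rng.normal(size=20_000)))
print("right-skewed (log-normal):", asymmetry_measures(rng.lognormal(sigma=0.6, size=20_000)))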

Two questions arise:
1. Has the Planckian information some relations with symmetry or asymmetry?
If yes, which ones?
That would not be shocking: Shu-Kun Lin (refs [2,3]) discussed the
relations between information and symmetry.
2. The asymmetry measures above have a major drawback: a null value can be
observed for some families of asymmetric distributions, and not only for
symmetric distributions.

In case you indeed need to consider the log of a non-negative quantity
measuring the asymmetry of a distribution, one which vanishes if and only if
the distribution is symmetric, you may consider the chiral index \chi
(section 2.9, ref [4]).
The \chi index takes values in [0..1] (in fact, in [0..1/2]) for univariate
probability distributions, and it is null if and only if the distribution
is symmetric.
It has other properties, but that falls outside the scope of this discussion.
Then, simply replace [ (\mu - mode) / \sigma ] by \chi as the argument of
log_2.

[1] Pearson, K.
Contributions to the Mathematical Theory of Evolution.--II. Skew Variation
in Homogeneous Material.
Phil. Trans. Roy. Soc. London (A.), 1895, 186, 343-414.

[2] Lin, S.K.
Correlation of Entropy with Similarity and Symmetry.
J. Chem. Inf. Comput. Sci. 1996, 36, 367--376

[3] Lin, S.K.
The Nature of the Chemical Process. 1. Symmetry Evolution –Revised
Information Theory, Similarity Principle and Ugly Symmetry.
Int. J. Mol. Sci. 2001, 2, 10--39
(available in open access)

[4] Petitjean, M.
Chirality and Symmetry Measures: A Transdisciplinary Review.
Entropy, 2003, 5[3], 271--312.
(available in open access)

Best regards,

Michel.

Michel Petitjean
MTi, INSERM UMR-S 973, University Paris 7,
CNRS SNC 9079
35 rue Helene Brion, 75205 Paris Cedex 13, France.
Phone: +331 5727 8434; Fax: +331 5727 8372
E-mail: petitjean.chi...@gmail.com (preferred),
michel.petitj...@univ-paris-diderot.fr
http://petitjeanmichel.free.fr/itoweb.petitjean.symmetry.html



Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Francesco Rizzo
Dear Sung and dear all,

"I think information and energy are inseparable in reality": this is true in
economics as well.

Part Three -- Theory of Value: Energy and Information -- of "Valore e
valutazioni. La scienza dell'economia o l'economia della scienza"
(FrancoAngeli, Milano, 1995-1999) consists of pages 451-646, which deal with
this interesting and significant set of issues.

Thank you and best wishes.
Francesco


[Fis] Are there 3 kinds of motions in physics and biology?

2018-05-06 Thread Sungchul Ji
Hi FISers,

I think information and energy are inseparable in reality.  Hence to understand 
what information is, it may be helpful to understand what energy (and the 
associated concept of motion) is.  In this spirit, I am forwarding the 
following email that I wrote motivated by the lecture given by Dr. Grossberg 
this afternoon at the 119th Statistical Mechanics Conference.  In Table 1 in 
the email, I divided particle motions studied in physics and biology into three 
classes -- (i) random, (ii) passive, and (iii) active, and identified the field 
of specialization wherein these motions are studied as (i) statistical 
mechanics, (ii) stochastic mechanics, and (iii) info-statistical mechanics.  
The last term was coined by me in 2012  in [1].  I will be presenting a short 
talk (5 minutes) on Info-statistical mechanics on Wednesday, May 9, at the 
above meeting.   The abstract of the short talk is given below:

Short talk to be presented at the 119th Statistical Mechanics Conference,
Rutgers University, Piscataway, N.J., May 6-9, 2018.

Planckian Information may be to Info-Statistical Mechanics what Boltzmann 
Entropy is to Statistical Mechanics.
Sungchul Ji, Department of Pharmacology and Toxicology, Ernest Mario School of 
Pharmacy, Rutgers University, Piscataway, N.J. 08854
Traditionally, the dynamics of any N-particle system in statistical mechanics
is completely described in terms of the 6N-dimensional phase space consisting
of the 3N positional coordinates and 3N momenta, where N is the number of
particles in the system [1]. Unlike the particles dealt with in statistical
mechanics, which are featureless and shapeless, the particles in biology have
characteristic shapes and internal structures that determine their biological
properties.  The particles in physics are completely described in terms of
energy and matter in the phase space, but the description of the particles in
living systems requires not only the energy and matter of the particles but
also their genetic information, consistent with the information-energy
complementarity (or gnergy) postulate discussed in [2, Section 2.3.2].  Thus,
it seems necessary to expand the dimensionality of the traditional phase space
to accommodate the information dimension, which includes the three coordinates
encoding the amount (in bits), meaning (e.g., recognizability), and value
(e.g., practical effects) of information [2, Section 4.3]. Similar views were
expressed by Bellomo et al. [3] and Mamontov et al. [4].  The expanded “phase
space” would comprise the 6N-dimensional phase space of traditional statistical
mechanics plus the 3N-dimensional information space entailed by molecular
biology.  The new space (to be called the “gnergy space”) composed of these two
subspaces would have 9N dimensions, as indicated in Eq. (1).  This equation
also makes contact with the concepts of synchronic and diachronic information
discussed in [2, Section 4.5].  It was suggested therein that the traditional
6N-dimensional phase space deals with synchronic information and hence was
referred to as the Synchronic Space, while the 3N-dimensional information space
is concerned with the consequences of history and evolution encoded in each
particle and thus was referred to as the Diachronic Space.  The resulting space
was called the gnergy space (since it encodes not only energy but also
information).

   Gnergy Space  =  6N-D Phase Space    +    3N-D Information Space        (1)
                    (Synchronic Space)       (Diachronic Space)

The study of both energy and information was defined as “info-statistical 
mechanics” in 2012 [2, pp. 102-106, 297-301].  The Planckian information of the 
second kind, I_PS [5], was defined as the negative of the binary logarithm of 
the skewness of the long-tailed histogram that fits the Planckian Distribution 
Equation (PDE) [6].   In Table 1, the Planckian information is compared to the 
Boltzmann entropy in the context of the complexity theory of Weaver [8]. The 
inseparable relation between energy and information that underlies 
“info-statistical mechanics” may be expressed by the following aphorism:
“Information without energy is useless;
Energy without information is valueless.”

Table 1.  A comparison between Planckian Information (of the second kind) and
Boltzmann entropy.  Adopted from [6, Table 8.3].

                   Order                        |            Disorder
  ----------------------------------------------+----------------------------------
  I_PS = -log_2 [(µ - mode)/σ]   (2008-2018)    |  S = k log W   (1872-75)
  Planckian Information                         |  Boltzmann entropy [7]
  Organized Complexity [8]                      |  Disorganized Complexity [8]
  Info-Statistical Mechanics [2, pp. 102-106]   |  Statistical Mechanics [1]
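
A toy numerical reading of the two columns (arbitrary numbers, purely
illustrative): the left-hand formula yields a dimensionless quantity in bits,
while the right-hand one, with Boltzmann's constant and the natural logarithm,
yields joules per kelvin.

import numpy as np

# Left column: I_PS for a hypothetical long-tailed histogram.
mu, mode, sigma = 12.0, 9.0, 5.0
print("I_PS =", -np.log2((mu - mode) / sigma), "bits")      # about 0.74 bits

# Right column: Boltzmann entropy for a hypothetical number of microstates W.
k_B = 1.380649e-23                                          # J/K
W = 1e20
print("S =", k_B * np.log(W), "J/K")                        # k log W, with the natural log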



References:
   [1] Tolman, R. C. (1979). The Principles of Statistical Mechanics. Dover Publications, Inc., New York, pp. 42-46.
   [2] Ji, S. (2012). Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical Applications. Springer, New York.
   [3] Bellomo, N., Bellouquid, A. and Harrero, M. A. (2007).  From m