Re: [Fis] If always n>0 why we need log

2018-06-03 Thread Sungchul Ji
Hi Krassimir,

I think the main reason that we express 'information' as a logarithmic
function of the number of choices available, n, may be that the human brain
finds it easier to remember (and communicate and reason with) '10' than
'10,000,000,000', or '100' than '10 . . . 0' (1 followed by 100 zeros); the
logarithm compresses a large count into a number of manageable size.
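
As a minimal numerical sketch of this compression (my illustration, not part
of the original exchange), in Python:

    import math

    for n in (1024, 10**10, 10**100):
        # log_2(n) bits suffice to single out one of n equally likely choices
        print(n, "->", round(math.log2(n), 1), "bits")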

All the best.

Sung




From: Krassimir Markov 
Sent: Sunday, June 3, 2018 12:06 PM
To: Foundation of Information Science
Cc: Sungchul Ji
Subject: If always n>0 why we need log

Dear Sung,

A simple question:

If always n>0 why we need log in

I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)

Friendly greetings

Krassimir


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis



Re: [Fis] The information-entropy relation clarified: The New Jerseyator

2018-06-03 Thread Sungchul Ji
Hi FISers,


I found a typo in the legend to Figure 1 in my last post:  ". . . without
energy dissipation, no energy, . . ." should read

"Without energy dissipation, no information."                              (4)


In fact, Statement (4) may be fundamental to informatics in general, so it
may be referred to as the "First Principle of Informatics" (FPI).


If this conjecture is correct, FPI may apply to the controversial
interpretations of the wavefunction of a material system (WFMS), since WFMS is
supposed to encode all the information we have about the material system under
consideration and hence implicates "information".  It thus seems to me that the
complete interpretation of a wavefunction, according to FPI, must specify the
selection process as well, i.e., the free energy-dissipating step, which I am
tempted to identify with "measurement", "quantum jump", or "wavefunction
collapse".


I am not a quantum mechanician, so it is possible that I have committed some
logical errors somewhere in my argument above.  If you detect any, please let
me know.


With all the best.


Sung


From: Fis  on behalf of Sungchul Ji 

Sent: Sunday, June 3, 2018 12:13:11 AM
To: 'fis'
Subject: [Fis] The information-entropy relation clarified: The New Jerseyator


Hi  FISers,


One simple (and may be too simple) way to distinguish between information and 
entropy may be as follows:


(i)  Define  information (I) as in Eq. (1)


      I = -log_2(m/n) = -log_2(m) + log_2(n)                               (1)


where n is the number of all possible choices (also called the variety) and m
is the number of choices actually made or selected.


(ii) Define the negative binary logarithm of n, i.e., -log_2(n), as the
'variety' of all possible choices, and hence as identical with the Shannon
entropy H, as suggested by Wicken [1].  Then Eq. (1) can be re-written as Eq. (2):


      I = -log_2(m) - H                                                    (2)


(iii) It is evident that when m = 1 (i.e., when only one choice is made out of
all the variety of choices available), Eq. (2) reduces to Eq. (3) (see the
numerical check following Figure 1 below):


      I = -H                                                               (3)


(iv) As is well known, Eq. (3) is the basis for the so-called "negentropy
principle of information" first advocated by Schroedinger and later by
Brillouin and others.  But Eq. (3) is clearly not a principle but a special
case of Eq. (2) with m = 1.


(v)  In conclusion, I claim that information and negative entropy are not the
same, either qualitatively or quantitatively (except when m = 1 in Eq. (2)),
and represent two opposite nodes of a fundamental triad [2]:








                               Selection
      H  ------------------------------------------------>  I
      (uncertainty before selection)                         (uncertainty after selection)



Figure 1.  The New Jerseyator model of information (NMI) [3].  Since selection
requires free energy dissipation, NMI implicates both information and energy.
That is, without energy dissipation, no information; hence NMI may be viewed as
a self-organizing process (also called a dissipative structure) or an ‘-ator’.
NMI is also consistent with the “uncertainty reduction model of information.”
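
A quick numerical check of Eqs. (1)-(3) above (my sketch, using the
definitions in (i) and (ii)), in Python:

    import math

    def info(m, n):
        # Eq. (1): I = -log_2(m/n) = -log_2(m) + log_2(n)
        return -math.log2(m / n)

    n = 8                       # number of available choices
    H = -math.log2(n)           # (ii): H = -log_2(n)
    for m in (1, 2, 4):
        assert abs(info(m, n) - (-math.log2(m) - H)) < 1e-12   # Eq. (2)
    print(info(1, n), -H)       # m = 1: Eq. (3), I = -H = 3 bits for n = 8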



With all the best.


Sung


P.s.  There is experimental evidence that information and entropy are
orthogonal, thus giving rise to the Planck-Shannon plane that has been shown to
distinguish between cancer and healthy cell mRNA levels.  I will discuss this
in a later post.



References:

   [1]  Wicken, J. S. (1987).  Entropy and information: suggestions for common
language.  Phil. Sci. 54: 176-193.
   [2]  Burgin, M. (2010).  Theory of Information: Fundamentality, Diversity,
and Unification.  World Scientific Publishing, New Jersey.

   [3]  Ji, S. (2018).  The Cell Language Theory: Connecting Mind and Matter.
World Scientific Publishing, New Jersey.  Figure 10.24.
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis




[Fis] Sound-induced Faraday waves in water droplets: The Effects of System Sizes

2018-05-31 Thread Sungchul Ji
Hi FISers,

About 6 months ago, John Stuart Reid [1] of the Sonic Age Lab in Cumbria,
England, published online a fascinating video clip showing the sound
vibration-induced formation of standing waves in individual water droplets of
50 to 100 microns in size [1], almost comparable to living cells in size; it is
reproduced below (click the link to view the video).

https://youtu.be/Z0St42jfgMU

The standing waves formed within these small water droplets are examples of
the Faraday waves first reported in 1831 [2].  In the following explanation,
John invokes the resonance mechanism to account for the differential effects of
the same sound input on the wave patterns exhibited by differently sized
sessile droplets, which I think is valid:
"In this short video we see a field of sessile drops, many of them in the 50 to 
100 micron range, mimicking the mass of many types of human cell. The sound 
used to excite the drops is code 133 of Cyma Technologies AMI 1000 sound 
therapy device. The entire field is around 4 mm in width yet the uptake of 
acoustic energy is significantly different between the various sizes of 
microscopic sessile drops, and at the point of Faraday Instability only two 
droplets reach full expression, while in others there is a very reduced 
acoustic uptake. This suggests that resonance may play a major role in the 
ability of cells to absorb acoustic energy."
The reason I am interested in the Faraday waves in sessile droplets is that I
see a link between these waves and the waves that I postulated to be induced by
energy input in all material systems in the Universe, from atoms to enzymes,
cells, brains, and human societies, to the Universe itself, the wave patterns
of which are thought to determine the functions of a given system [3, 4].  This
idea is schematically represented in Figure 1, reproduced from [3, 4]:

Figure 1.  One possibility to account for the universality of the Planckian 
distribution in nature is to postulate that the wave-particle duality first 
discovered in atomic physics operates at all scales of material systems, from 
atoms to the Universe. Reproduced from [2, 3].

Figure 1 can readily accommodate the sound-induced Faraday waves in sessile
water droplets captured by the CymaScope simply by adding a 10th arrow directed
to "10. Faraday waves in sessile droplets".

According to this interpretation, the sound-induced Faraday waves formed in
sessile water droplets as visualized by the CymaScope obey and embody the
principle of wave-particle duality (PWPD), and hence I predict that the digital
CymaScopic images of these droplets should fit PDE, the Planckian Distribution
Equation, y = (A/(x + B)^5)/(exp(C/(x + B)) - 1), where x is the signal
intensity of the CymaScopic image pixels and y is their frequency.  (A
curve-fitting sketch of PDE appears after the abstract below.)  If this
prediction is validated, the phenomenon of the sound-induced Faraday waves in
sessile water droplets visualized by the CymaScope may be considered one of the
simplest mesoscopic material systems in which PWPD is proven to operate, thus
opening up the possibility that PWPD may also operate in living cells and their
component biopolymers, as I suggested in the abstract to the 2017 Biophysical
Society Annual Meeting [6], which is reproduced below:

"261-Pos Board B26
Protein Folding as a Resonance Phenomenon, with Folding Free Energies 
Determined by Protein-Hydration Shell Interactions   Sungchul Ji. Pharmacology 
and Toxicology, Rutgers University, Kendall Park, NJ, USA.

The single-molecule enzyme-turnover-time histogram of cholesterol oxidase [1] 
resembles the blackbody radiation spectrum at 4000 K. This observation 
motivated the author to generalize the Planck radiation equation (PRE),
S_λ = (8πhc/λ^5)/(e^(hc/λkT) - 1), by replacing the universal constants and
temperature by free parameters, resulting in the Planckian Distribution
Equation (PDE), y = (A/(x + B)^5)/(e^(C/(x + B)) - 1) [2].  Since the first
factor in PRE reflects the
number of standing waves generated in the blackbody and the second factor the 
average energy of the standing waves [3], it was postulated that any material 
system that generates data fitting PDE can be interpreted as implicating 
standing waves with associated average energies [2]. PDE has been found to fit 
the long-tailed histogram of the folding free-energy changes measured from 
4,300 proteins isolated from E. coli [4]. One possible interpretation of this 
finding is (i) that proteins (P) and their hydration shells (HS) are organized 
systems of oscillators with unique sets of natur
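
Returning to the PDE defined above, here is a minimal curve-fitting sketch (my
illustration; the parameter values and data are synthetic, not CymaScope
measurements):

    import numpy as np
    from scipy.optimize import curve_fit

    def pde(x, A, B, C):
        # Planckian Distribution Equation: y = (A/(x + B)^5) / (exp(C/(x + B)) - 1)
        return (A / (x + B)**5) / (np.exp(C / (x + B)) - 1.0)

    x = np.linspace(1, 50, 50)                            # stand-in pixel intensities
    y = pde(x, 2e7, 3.0, 40.0)                            # synthetic "histogram"
    y += np.random.default_rng(0).normal(0, 0.01, x.size) # add measurement noise
    (A, B, C), _ = curve_fit(pde, x, y, p0=(1e7, 1.0, 30.0))
    print(A, B, C)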

Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-08 Thread Sungchul Ji
Hi Michel,


Thank you for your informative comments and helpful suggestions in your
earlier post (which I happened to have deleted by accident).  In any case, I
have a copy of the post, so I can answer the questions you raised therein.


(1)  I am defining the Planckian information, I_P, as the information required
to transform a symmetric, Gaussian-like equation (GLE) into the Planckian
distribution, where GLE is the Gaussian distribution with the pre-exponential
factor replaced by a free parameter A, i.e., y = A*exp(-(x - μ)^2/(2σ^2)),
which was found to overlap with PDE (the Planckian Distribution Equation) in
the rising phase.  So far we have two different ways of quantifying I_P: (i)
the Planckian information of the first kind, I_PF = log_2[AUC(PDE)/AUC(GLE)],
where AUC is the area under the curve, and (ii) the Planckian information of
the second kind, I_PS = -log_2[(μ - mode)/σ], which applies to right-skewed
long-tailed histograms only.  To make it apply also to left-skewed long-tailed
histograms, it would be necessary to replace (μ - mode) with its absolute
value, i.e., |μ - mode| (a numerical sketch of I_PS follows after point (5)
below).


(2)  There can be more than two kinds of Planckian information, including what
may be called the Planckian information of the third kind, i.e., I_PT =
-log_2(χ), as you suggest.  (By the way, how do you define χ?)

(3)  The definition of Planckian information given in (1) implies that I_P is
associated with asymmetric distributions generated by distorting the symmetric
Gaussian-like distribution, i.e., by transforming the x-coordinate non-linearly
while keeping the y-coordinate of the Gaussian distribution invariant [1].




                                  GP                      definition
   Gaussian-like Distribution ----------> PDE ----------------------> I_P



Figure 1.  The definitions of the Gaussian process (GP) and the Planckian
information (I_P) based on PDE, the Planckian Distribution Equation.  GP is the
physicochemical process generating a long-tailed histogram fitting PDE.




(4)  I am assuming that the PDE-fitting asymmetric histograms will always have
non-zero measures of asymmetry.

(5)  I have shown in [1] that the human decision-making process is an example 
of the Planckian process that can be derived from a Gaussian distribution based 
on the drift-diffusion model well-known in the field of decision-making 
psychophysics.
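
As promised in point (1), a minimal sketch of I_PS (my illustration; the
skewed sample is synthetic):

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.lognormal(mean=1.0, sigma=0.5, size=100_000)  # right-skewed sample

    mu, sigma = data.mean(), data.std()
    counts, edges = np.histogram(data, bins=200)
    i = np.argmax(counts)
    mode = 0.5 * (edges[i] + edges[i + 1])                   # histogram mode

    I_PS = -np.log2((mu - mode) / sigma)   # Planckian information, second kind
    print(I_PS)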

Reference:
   [1] Ji, S. (2018).  The Cell Language theory: Connecting Mind and Matter.  
World Scientific Publishing, New Jersey.   Figure 8.7, p. 357.

All the best.

Sung





From: Fis  on behalf of Michel Petitjean 

Sent: Monday, May 7, 2018 2:05 PM
To: fis
Subject: Re: [Fis] Are there 3 kinds of motions in physics and biology?

Dear Karl,
In my reply to Sung I was dealing with the asymmetry of probability
distributions.
Probability distributions are presented on the Wikipedia page:
https://en.wikipedia.org/wiki/Probability_distribution
Don't read all of this page; the beginning should suffice.
Then, skewness is explained on another wiki page:
https://en.wikipedia.org/wiki/Skewness
Possibly the content of these two pages is unclear for you.
In order to avoid a lot of long and unnecessary explanations, you
may tell me what you already know about probability distributions and
what was unclear from my post; then I can explain more efficiently.
However, I let Sung explain about his own post :)
Best regards,
Michel.

2018-05-07 19:55 GMT+02:00 Michel Petitjean :
> Dear Karl,
> Yes I can hear you.
> About symmetry, I shall soon send you an explaining email, privately, because 
> I do not want to bother the FISers with long explanations (unless I am 
> required to do it).
> However, I confess that many posts that I receive from the FIS list are very 
> hard to read, and often I do not understand their deep content :)
> In fact, that should not be shocking: few people are able to read texts from 
> very diverse fields (as it occurs in the FIS forum), and I am not one of them.
> Even the post of Sung was unclear for me, and it is exactly why I asked him 
> questions, but only on the points that I may have a chance to understand (may 
> be).
> Best regards,
> Michel.
>
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

[Fis] Are there 3 kinds of motions in physics and biology?

2018-05-06 Thread Sungchul Ji
Hi FISers,

I think information and energy are inseparable in reality.  Hence to understand 
what information is, it may be helpful to understand what energy (and the 
associated concept of motion) is.  In this spirit, I am forwarding the 
following email that I wrote motivated by the lecture given by Dr. Grossberg 
this afternoon at the 119th Statistical Mechanics Conference.  In Table 1 in 
the email, I divided particle motions studied in physics and biology into three 
classes -- (i) random, (ii) passive, and (iii) active, and identified the field 
of specialization wherein these motions are studied as (i) statistical 
mechanics, (ii) stochastic mechanics, and (iii) info-statistical mechanics.  
The last term was coined by me in 2012  in [1].  I will be presenting a short 
talk (5 minutes) on Info-statistical mechanics on Wednesday, May 9, at the 
above meeting.   The abstract of the short talk is given below:

Short talk to be presented at the 119th Statistical Mechanics Conference,
Rutgers University, Piscataway, N.J., May 6-9, 2018.

Planckian Information may be to Info-Statistical Mechanics what Boltzmann 
Entropy is to Statistical Mechanics.
Sungchul Ji, Department of Pharmacology and Toxicology, Ernest Mario School of 
Pharmacy, Rutgers University, Piscataway, N.J. 08854
Traditionally, the dynamics of any N-particle system in statistical mechanics
is completely described in terms of the 6N-dimensional phase space consisting
of the 3N positional coordinates and 3N momenta, where N is the number of
particles in the system [1]. Unlike the particles dealt with in statistical
mechanics which are featureless and shapeless, the particles in biology have 
characteristic shapes and internal structures that determine their biological 
properties.  The particles in physics are completely described in terms of
energy and matter in the phase space, but the description of the particles in
living systems requires not only the energy and matter of the particles but
also their genetic information, consistent with the information-energy
complementarity (or gnergy) postulate discussed in [2, Section 2.3.2].  Thus,
it seems necessary to expand the dimensionality of the traditional phase space 
to accommodate the information dimension, which includes the three coordinates 
encoding the amount (in bits), meaning (e.g., recognizability), and value 
(e.g., practical effects) of information [2, Section 4.3]. Similar views were 
expressed by Bellomo et al. [3] and Mamontov et al. [4].  The expanded “phase 
space” would comprise the 6N phase space of traditional statistical mechanics 
plus the 3N information space entailed by molecular biology.  The new space (to 
be called the “gnergy space”) composed of these two subspaces would have 9N 
dimensions as indicated in Eq. (1).  This equation also makes contact with the 
concepts of synchronic and diachronic information discussed in [2, Section
4.5].  It was suggested therein that the traditional 6N-dimensional phase space
deals with  the synchronic information and hence was referred to as the 
Synchronic Space while the 3N-dimensional information space is concerned with 
the consequences of history and evolution encoded in each particle and thus was 
referred to as the Diachronic Space.  The resulting space was called the gnergy 
space (since it encodes not only energy but also information).

   Gnergy Space  =  6N-D Phase Space    +   3N-D Information Space         (1)
                    (Synchronic Space)      (Diachronic Space)
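
As a toy illustration (entirely my own construction, not from [2]) of one
point in the 9N-dimensional gnergy space, with 6 phase-space coordinates plus
3 information coordinates per particle:

    from dataclasses import dataclass

    @dataclass
    class GnergyParticle:
        # synchronic (phase-space) coordinates: 3 positions + 3 momenta
        position: tuple
        momentum: tuple
        # diachronic (information) coordinates: amount (bits), meaning, value
        info: tuple

    system = [GnergyParticle((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.5, 0.1))
              for _ in range(10)]
    print(len(system) * 9, "dimensions")   # 9N for N = 10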

The study of both energy and information was defined as “info-statistical 
mechanics” in 2012 [2, pp. 102-106, 297-301].  The Planckian information of the 
second kind, IPS, [5] was defined as the negative of the binary logarithm of 
the skewness of the long-tailed histogram that fits the Planckian Distribution 
Equation (PDE) [6].   In Table 1, the Planckian information is compared to the 
Boltzmann entropy in the context of the complexity theory of Weaver [8]. The 
inseparable relation between energy and information that underlies 
“info-statistical mechanics” may be expressed by the following aphorism:
“Information without energy is useless;
Energy without information is valueless.”

Table 1.  A comparison between Planckian Information (of the second kind) and 
Boltzmann entropy.  Adopted from [6, Table 8.3].

   Order                                         Disorder
   -------------------------------------------   -------------------------------
   IPS = -log2[(µ - mode)/σ]   (2008-2018)       S = k log W   (1872-75)
   Planckian Information                         Boltzmann entropy [7]
   Organized Complexity [8]                      Disorganized Complexity [8]
   Info-Statistical Mechanics [2, pp. 102-106]   Statistical Mechanics [1]
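
For concreteness (my own worked example, not from the abstract): with W = 2^N
microstates for N two-state units, S = k ln W gives, for one mole of units:

    import math

    k_B = 1.380649e-23            # Boltzmann constant, J/K
    N = 6.022e23                  # one mole of two-state units, W = 2^N
    S = k_B * N * math.log(2)     # S = k ln W = k N ln 2
    print(S, "J/K")               # ~5.76 J/K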



References:
   [1] Tolman, R. C. (1979). The Principles of Statistical Mechanics,  Dover 
Publications, Inc.,
New York, pp. 42-46.
   [2] Ji, S. (2012) Molecular Theory of the Living Cell: Concepts, Molecular 
Mechanisms, and
Biomedical Applications.  Springer, New York.
   [3] Bellomo, N., Bellouquid, A. and Harrero, M. A. (2007).  From

Re: [Fis] Fw: The 'Shirasawa phenomenon' or the 'Shirasawa effect"

2018-05-06 Thread Sungchul Ji
Hi Karl,


Thanks for your comment.


According to N. Bohr, there are two kinds of opposites, A and B -- (i)
supplementarity, wherein A and B add up to make the whole (e.g., the
forest-tree pair), and (ii) complementarity, wherein A or B is the whole,
depending on how the whole is observed (e.g., light as either wave or particle,
depending on how it is measured).  I can send you the reference if needed.


Sung


From: karl javorszky 
Sent: Friday, May 4, 2018 2:50:50 PM
To: Sungchul Ji
Cc: Stanley N. Salthe; fis
Subject: Re: [Fis] Fw: The 'Shirasawa phenomenon' or the 'Shirasawa effect"

Dear Sung,

It is very encouraging to see the discussion of the difficulties human
perception poses while trying to consolidate opposites.

The existence of the mental image is built on contrasts, so no wonder we find 
it hard to get a good grip on the mechanisms at work consolidating 
contradictions.

To the opposites we work on :

tree vs. forest,
top vs. bottom,
little vs. big,

could we also add:

background vs. foreground,
across the flow vs. along the flow of time,
commutative vs. sequenced?

If so, there appear to be some encouraging hints that a rational methodology
has been found to consolidate opposites.

Karl

Sungchul Ji <s...@pharmacy.rutgers.edu> wrote on Thu., May 3, 2018, 18:01:

Hi Stan,


True.  Our brain seems to have many limitations, one of which is our inability 
to see the forest and the trees simultaneously.


It is interesting to note that we cannot measure (or at least cannot easily
measure) particles and waves of quons (or quantum objects) simultaneously
either, although there are occasional claims asserting otherwise. Here we have
two entities, A and B, that are not compositionally related (i.e., A is not a
part of B), as trees and the forest are, but "complementarily" related (i.e.,
A^B, read A or B, depending on measurement); hence their relation does not
involve any hierarchy.


All the best.


Sung


From: Fis <fis-boun...@listas.unizar.es>
on behalf of Stanley N Salthe <ssal...@binghamton.edu>
Sent: Sunday, April 29, 2018 9:49 AM
To: fis
Subject: Re: [Fis] Fw: The 'Shirasawa phenomenon' or the 'Shirasawa effect"

Sung -- regarding:

The reason epigenetics (defined here as the process of inheritance without
implicating any changes in the nucleotide sequences of DNA) was not mentioned
in my previous post is because I was mainly interested in the bottom-up (from
micro to macro) mechanism of genetics, not the top-down (from macro to micro)
mechanism.  It is interesting to note that our brain seems unable to handle
both bottom-up and top-down mechanisms simultaneously; perhaps it may have
something to do with the fact that we have two brain hemispheres (Yin and Yang)
but only one vocal cord (the Dao).

It is interesting that I early realized the difficulty many folks have with 
visualizing at one time both the top-down AND bottom-up aspects of the 
compositional hierarchy:
[large scale constraints -> [activity in focus <- [small scale 
affordances]]]

Perhaps your suggestion is involved here as well!

STAN

On Sat, Apr 28, 2018 at 5:17 PM, Sungchul Ji <s...@pharmacy.rutgers.edu> wrote:

Hi Arthur and  FISers,

Thank you for asking an important question.  The reason epigenetics (defined
here as the process of inheritance without implicating any changes in the
nucleotide sequences of DNA) was not mentioned in my previous post is because
I was mainly interested in the bottom-up (from micro to macro) mechanism of
genetics, not the top-down (from macro to micro) mechanism.  It is interesting
to note that our brain seems unable to handle both bottom-up and top-down
mechanisms simultaneously; perhaps it may have something to do with the fact
that we have two brain hemispheres (Yin and Yang) but only one vocal cord (the
Dao).

One way to integrate the bottom-up and top-down mechanisms underlying genetic
phenomena may be to invoke the principle of vibrational resonance -- to view
both the micro-scale DNA and the macro-scale environment of organisms as
vibrational systems, or systems of oscillators, that can exchange information
and energy through the well-known mechanisms of resonance (e.g., the resonance
between the oscillatory motions of the swing and the arms of the mother; both
motions must have the same frequency, otherwise the child will not swing).
According to the Fourier theorem, any oscillatory motions of DNA, including
very low frequencies, can be generated by linear combinations of very fast
covalent bond vibrations in DNA and hence can be coupled to slow oscillatory
motions of the environment, e.g., musical sounds.  If this view is correct,
music can affect, DIRECTLY (i.e., unmediated by the auditory system of the
brain), the molecular motions of DNA in every cell in our body.  In other wor


Re: [Fis] Fw: The 'Shirasawa phenomenon' or the 'Shirasawa effect"

2018-04-28 Thread Sungchul Ji
Hi Arthur and  FISers,

Thank you for asking an important question.  The reason epigenetics (defined
here as the process of inheritance without implicating any changes in the
nucleotide sequences of DNA) was not mentioned in my previous post is because
I was mainly interested in the bottom-up (from micro to macro) mechanism of
genetics, not the top-down (from macro to micro) mechanism.  It is interesting
to note that our brain seems unable to handle both bottom-up and top-down
mechanisms simultaneously; perhaps it may have something to do with the fact
that we have two brain hemispheres (Yin and Yang) but only one vocal cord (the
Dao).

One way to integrate the bottom-up and top-down mechanisms underlying genetic
phenomena may be to invoke the principle of vibrational resonance -- to view
both the micro-scale DNA and the macro-scale environment of organisms as
vibrational systems, or systems of oscillators, that can exchange information
and energy through the well-known mechanisms of resonance (e.g., the resonance
between the oscillatory motions of the swing and the arms of the mother; both
motions must have the same frequency, otherwise the child will not swing).
According to the Fourier theorem, any oscillatory motions of DNA, including
very low frequencies, can be generated by linear combinations of very fast
covalent bond vibrations in DNA and hence can be coupled to slow oscillatory
motions of the environment, e.g., musical sounds.  If this view is correct,
music can affect, DIRECTLY (i.e., unmediated by the auditory system of the
brain), the molecular motions of DNA in every cell in our body.  In other
words, we can hear music not only through our ears but also through our whole
body, including blood.  Because of a patent issue, I cannot reveal the
experimental evidence supporting this claim but, in due course, I hope to share
with you the scientific evidence we obtained recently.
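
A minimal sketch of the Fourier-superposition point above (my illustration):
summing two fast oscillations of nearly equal frequency produces a slow
envelope (a beat), i.e., low-frequency structure from fast components:

    import numpy as np

    t = np.linspace(0.0, 1.0, 10_000)
    fast1 = np.sin(2 * np.pi * 1000 * t)   # 1000 Hz oscillation
    fast2 = np.sin(2 * np.pi * 1010 * t)   # 1010 Hz oscillation
    combined = fast1 + fast2               # linear combination of fast modes
    # the amplitude envelope oscillates at the 10 Hz difference frequency
    print(np.abs(combined).max())          # ~2 at the beat maxima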

In conclusion, it may be that  the yin-yang doctrine of the Daoist philosophy 
(or any other equivalent principles) applies here, since molecular genetics and 
epigenetics may constitute  the irreconcilable opposites:

"Genetics has two complementary aspects -- molecular genetics and epigenetics."

"Molecular genetics and epigenetics are the complementary aspects of genetics."

"Genetic phenomena can be accounted for in two irreconcilably opposite manner 
with equal validity -- through the bottom-up (or reductionistic) or the 
top-down  (or holistic) approaches."

The last statement would help avoid many wasteful debates in the field of 
genetics.

 If you have any questions or corrections, please let me know.

Sung












From: Arthur Wist 
Sent: Friday, April 27, 2018 6:48 PM
To: Sungchul Ji; FIS FIS
Cc: sbur...@proteomics.rutgers.edu; Sergey Petoukhov; ole2001@med.cornell; 
dani...@shirasawa-acl.net; Sungchul Ji; x...@chemistry.harvard.edu; 
n...@princeton.edu
Subject: Re: [Fis] Fw: The 'Shirasawa phenomenon' or the 'Shirasawa effect"

Hi,

Just a short note to first of all say thank you, I've find this very
helpful to know albeit I can't point to a direct application. Secondly
however, I do wonder: Why & how come you neglected to - in either an
inclusionary or exclusionary manner - address any potential epigenetic
mechanisms?

Kind regards,


Arthur

On 20 April 2018 at 19:32, Sungchul Ji  wrote:
> Hi,
>
>
> I am forwarding a slightly modified version of my previous post with the
> same title which was rejected by the FIS list due to the heavy attachments.
> The most significant addition is written in green.  The removed attachments
> are now replaced by their web addresses from which they can be downloaded
> free of charge.
>
>
> Best.
>
>
> Sung
>
> 
> From: Sungchul Ji
> Sent: Thursday, April 19, 2018 11:02 AM
> To: FIS FIS
> Cc: Sergey Petoukhov; dani...@shirasawa-acl.net; John Stuart Reid; sayer ji;
> sji.confor...@gmail.com; x...@chemistry.harvard.edu;
> sbur...@proteomics.rutgers.edu; n...@princeton.edu
> Subject: The 'Shirasawa phenomenon' or the 'Shirasawa effect"
>
>
> Hi FISers,
>
>
> In 2003, Takuji Shirasawa and his coworkers [1] found that mutating certain
> amino acids in the hemoglobin molecule (Hb) in mice produced the following
> effects:
>
> (1) increased O_2 consumption and CO_2 production,
>
> (2) the conversion of the muscle histology from a fast glycolytic to a fast
> oxidative type,
>
> (3) a mild anemia, and
>
> (4) faster running speed.
>
>
> In other words, Shirasawa et al provided a concrete example of molecular
> changes (e.g., amino acid mutations in Hb)  leading to (or associated with)
> macroscopic physiological changes in whole animals (e.g., anemia,  running
> behavior, etc

[Fis] Fw: The 'Shirasawa phenomenon' or the 'Shirasawa effect"

2018-04-20 Thread Sungchul Ji
Hi,


I am forwarding a slightly modified version of my previous post with the same 
title which was rejected by the FIS list due to the heavy attachments. The most 
significant addition is written in green.  The removed attachments are now 
replaced by their web addresses from which they can be downloaded free of 
charge.


Best.


Sung


From: Sungchul Ji
Sent: Thursday, April 19, 2018 11:02 AM
To: FIS FIS
Cc: Sergey Petoukhov; dani...@shirasawa-acl.net; John Stuart Reid; sayer ji; 
sji.confor...@gmail.com; x...@chemistry.harvard.edu; 
sbur...@proteomics.rutgers.edu; n...@princeton.edu
Subject: The 'Shirasawa phenomenon' or the 'Shirasawa effect"


Hi FISers,


In 2003, Takuji Shirasawa and his coworkers [1] found that mutating certain 
amino acids in the hemoglobin molecule (Hb) in mice produced the following 
effects:


(1) increased O_2 consumption and CO_2 production,

(2) the conversion of the muscle histology from a fast glycolytic to a fast 
oxidative type,

(3) a mild anemia, and

(4) faster running speed.


In other words, Shirasawa et al. provided a concrete example of molecular
changes (e.g., amino acid mutations in Hb) leading to (or associated with)
macroscopic physiological changes in whole animals (e.g., anemia, running
behavior, etc.).  For the convenience of discussion, I am taking the liberty
of referring to this finding as the "Shirasawa et al. phenomenon/effect" or,
more briefly, the "Shirasawa phenomenon/effect", which may be viewed as the
macroscopic version of the Bohr effect [2].


The 'Shirasawa phenomenon/effect' is not limited to hemoglobin.  There are now
many similar phenomena found in the field of voltage-gated ion channels, i.e.,
molecular changes in the amino acid sequences of ion channel proteins leading
to (or associated with) macroscopic effects on the human body called diseases
[3].


Although the current tendency among practicing molecular biologists and
biophysicists would be to explain away what is here called the Shirasawa
phenomenon in terms of the microscopic changes "causing" the macroscopic
phenomenon on a 1:1 basis, another possibility is that the microscopic changes
"cause" a set of other microscopic changes at the DNA molecular level, which in
turn cause a set of macroscopic changes, in a many-to-many fashion.


Current trend:  Microscopic change (Hb mutation) -> Macroscopic change 1
(oxygen affinity change of blood) -> Macroscopic change 2 (O_2 metabolism,
anemia, running behavior)



Alternative mechanism:  Microscopic change 1 (Hb mutation) -> Microscopic
change 2 (changes in the standing waves in DNA) -> Multiple macroscopic changes
(O_2 metabolism, anemia, muscle cell histological changes).


The alternative mechanism proposed here seems to me to be more consistent with 
the newly emerging models of molecular genetics [4] and single-molecule 
enzymology [5, 6].



Since the 'Shirasawa phenomenon/effect' evidently implicates information 
transfer from the microscale to the macroscale, it may be of interest to many 
information theoreticians in this group.   If you have any questions, comments, 
or suggestions, please let me know.


All the best.


Sung



References:

   [1] Shirasawa, T., et al. (2003).  Oxygen Affinity of Hemoglobin Regulates
O_2 Consumption, Metabolism, and Physical Activity.  J. Biol. Chem. 278(7):
5035-5043.  PDF at http://www.jbc.org/content/278/7/5035.full.pdf

   [2] The Bohr effect.  https://en.wikipedia.org/wiki/Bohr_effect
   [3] Huang, W., Liu, M., Yan, S. F. and Yan, N. (2017).  Structure-based
assessment of disease-related mutations in human voltage-gated sodium channels.
Protein Cell 8(6): 401-438.  PDF at https://www.ncbi.nlm.nih.gov/pubmed/28150151

   [4] Petoukhov, S. V. (2016).  The system-resonance approach in modeling 
genetic structures. BioSystems 139: 1–11. PDF at 
https://www.sciencedirect.com/science/article/pii/S0303264715001732

   [5] Lu, H. P., Xun, L. and Xie, X. S. (1998) Single-Molecule Enzymatic 
Dynamics. Science 282:1877-1882.  PDF at 
http://www.jbc.org/content/274/23/15967.short
   [6] Ji, S. (2017). RASER Model of Single-Molecule Enzyme Catalysis and Its 
Application to the Ribosome Structure and Function. Arch Mol. Med & Gen 1:104. 
PDF at http://hendun.org/journals/AMMG/PDF/AMMG-18-1-104.pdf






___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] A Paradox

2018-02-26 Thread Sungchul Ji
Hi FISers,


I am not sure whether I am a Procrustes (bed) or a Peirce (triadomaniac), but
I cannot help seeing an ITR (Irreducible Triadic Relation) among Text, Context,
and Meaning, as depicted in Figure 1.


                    f                          g
      Context ----------------> Text ----------------> Meaning
         |                                                ^
         |                                                |
         |________________________________________________|
                                 h



“The meaning of a text is irreducibly dependent on its context.”



 “Text, context, and meaning are irreducibly triadic.”   The “TCM principle” (?)



Figure 1.  The Procrustean bed, the Peircean triadomaniac, or both?

f =  Sign production;  g =  Sign interpretation;  h = Correlation or 
information flow.
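
One way to read the diagram (my sketch, not from the post) is as a commutative
triangle in which the direct arrow h agrees with the composite of f and g:

    # toy illustration: Context, Text, and Meaning all represented as strings
    def f(context):           # sign production: Context -> Text
        return "text about " + context
    def g(text):              # sign interpretation: Text -> Meaning
        return "meaning of '" + text + "'"
    def h(context):           # correlation/information flow: Context -> Meaning
        return g(f(context))  # commutativity: h = g o f

    print(h("an earthquake in Armenia"))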


According to this 'Peircean/Procrustean' diagram, both what Terry said and
what Xueshan said may be valid.  Although their thinking must have been
irreducibly triadic (if Peirce is right), Terry may have focused on (or
prescinded) Steps f and h, while Xueshan prescinded Steps g and h, although he
did indicate that his discussion was limited to the context of human
information and human meaning (i.e., Step f).  Or maybe there are many other
interpretations possible, depending on the interpreter of the posts under
discussion and the ITR diagram.

There are an infinite number of examples of algebraic operations: 2 + 3 = 5,
3 - 1 = 2, 20 x 45 = 900, etc.  If I say "2 + 3 = 5", someone may say, "but you
missed 20 x 45 = 900."  In other words, no matter what specific algebraic
operation I may come up with, my opponent can always succeed in coming up with
an example I missed.  The only solution to such an endless debate would be to
discover the axioms of algebra, at which level there cannot be any debate.
When I took an abstract algebra course as an undergraduate at the University of
Minnesota, Duluth, in 1962-5, I could not believe that underlying all the
complicated algebraic calculations possible there are only 5 axioms
(https://www.quora.com/What-is-the-difference-between-the-5-basic-axioms-of-algebra).

So can it be that there are the axioms (either symbolic,  diagrammatic, or 
both) of information science waiting to be discovered, which will end all the 
heated debates on information, meaning, data, etc. ?

All the best.

Sung


From: Fis  on behalf of Terrence W. DEACON 

Sent: Monday, February 26, 2018 1:13 PM
To: Xueshan Yan
Cc: FIS Group
Subject: Re: [Fis] A Paradox

It is so easy to get into a muddle mixing technical uses of a term with
colloquial uses; add a dash of philosophy and discipline-specific terminology
and it becomes mental quicksand.  Terms like 'information' and 'meaning' easily
lead us into these sorts of confusions because they have so many
context-sensitive and paradigm-specific uses.  This is well exhibited in these
FIS discussions, and is a common problem in many interdisciplinary discussions.
I have regularly requested that contributors to FIS try to label which paradigm
they are using to define their use of the term 'information' in these posts,
but sometimes, like fish unaware that they are in water, one forgets that there
can be alternative paradigms (such as the one Søren suggests).

So, to try to avoid overly technical usage, can you be specific about what you
intend to denote with these terms?
E.g., for the term "information", are you referring to statistical features
intrinsic to the character string with respect to possible alternatives, or to
what an interpreter might infer that this English sentence refers to, or to
whether this reference carries use value or special significance for such an
interpreter?
And, e.g., for the term "meaning", are you referring to what a semantician
would consider its underlying lexical structure, or to whether the sentence
makes any sense, or refers to anything in the world, or how it might impact
some reader?
Depending on how you specify your uses, your paradox will either become
irresolvable or dissolve.

— Terry

On Mon, Feb 26, 2018 at 1:47 AM, Xueshan Yan <y...@pku.edu.cn> wrote:

Dear colleagues,

In my teaching career in Information Science, I was often puzzled by the
following inference, which I call the Paradox of Meaning and Information, or
the Armenia Paradox.  In order not to produce unnecessary ambiguity, I state it
below and strictly limit our discussion to the human context.



Suppose an earthquake occurred in Armenia last night and all of the main media 
of the world have given the 

[Fis] ITR (Irreducible Triadic Relation) in Plato's Cave Allegory (PCA)

2018-02-24 Thread Sungchul Ji
Hi FISers,


(1) Perhaps this has been discussed already, either on this list or elsewhere,
i.e., the possible connection between the Irreducible Triadic Relation (ITR)
and Plato's Allegory of the Cave (PAC).  Regardless, I would like to propose
below my own version of the relation between ITR and PAC.




https://philosophyzer.wordpress.com/2012/09/21/the-allegory-of-the-cave-by-plato-summary-and-meaning/


(2) We can represent ITR diagrammatically as shown in the legend to Table 1 
which is also known as the commutative triangle in category theory. As I 
summarized in the main body of Table 1, the 3 nodes (A, B, & C) and 3 edges (f, 
g, & h) of the ITR diagram have specific examples in three different systems -- 
(i) Plato's cave, (ii) natural science, and (iii) semiotics.



Table 1.  The Irreducible Triadic Relation (ITR) in Plato’s cave, science , and 
semiotics.

              f            g
     A ----------> B ----------> C
     |                           ^
     |                           |
     |___________________________|
                   h


   Plato's cavemen:  A = Form (or Real world); B = Shadows on the wall;
       C = Thought; f = Physical laws; g = Perception; h = Universal causality.

   Scientists:  A = 4-D structure of enzymes; B = 3-D structure of enzymes;
       C = Scientific model; f = Measurement; g = Interpretation;
       h = Correlation.

   Semioticians:  A = Object; B = Sign; C = Interpretant; f = Sign production;
       g = Sign interpretation; h = Grounding correlation.





(3) I am assuming that Plato's cavemen divide into two groups -- (i) the
common cavemen, who think (and believe) that the shadows are real, and (ii) the
enlightened cavemen, whose intellect distinguishes between shadows and the real
objects casting them.  It may be justified to describe the difference between
the way the common cavemen think and the way the enlightened cavemen think as
'dyadic thinking' (i.e., Step f reversed, or A <- B only) versus 'triadic
thinking' (i.e., the entire commutative triangle), respectively.


(4) I liken the many brilliant and hardworking scientists who believe that
studying the static 3-dimensional structures of all the enzymes and proteins in
the mitochondrion will eventually solve the mystery of how the organelle works
in living cells to the common cavemen defined above.  In this sense, I believe
that Plato's allegory of the cave applies to contemporary science.


(5) If it can be validated that there are indeed two types of thinkers, both
among Plato's cavemen and in modern science, the distinction may also apply to
information science, leading to the prediction that there will be two kinds of
information scientists -- (i) dyadic-thinking information scientists and (ii)
triadic-thinking information scientists.


(6) Finally, one interesting spinoff of Table 1 may be that its Column A sheds
new light on what Plato might have meant by his Form or Idea, thus contributing
to solving the Plato-Aristotle debate on the Form-Matter duality.


Any questions or comments are welcome as always.


Sung


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] The unification of the theories of information based on the category theory

2018-02-08 Thread Sungchul Ji
Hi Terry,  and FISers,


Can it be that "language metaphor" is akin to a (theoretical) knife that, in 
the hands of a surgeon, can save lives but, in a wrong hand, can kill?


All the best.


Sung


From: Francesco Rizzo <13francesco.ri...@gmail.com>
Sent: Thursday, February 8, 2018 2:56:11 AM
To: Terrence W. DEACON
Cc: Fis,; Sungchul Ji
Subject: Re: [Fis] The unification of the theories of information based on the
category theory

Dear Terry (and everyone),
It is always a pleasure to read and understand you.  The general theory of
information is preceded by a system (or semiotics) of signification and
followed by a system (or semiotics) of communication -- except when there is a
communicative process such as the passage of a Signal (which does not
necessarily mean 'a sign') from a Source, through a Transmitter, along a
Channel, to a Receiver.  In a machine-to-machine process the signal has no
'signifying' power.  In that case there is no signification, even if one can
say that a passage of information has occurred.  When the receiver is a human
being (and the source need not also be a human being), we are in the presence
of a process of signification.  A system of signification is an autonomous
semiotic construction, independent of any possible act of communication that
actualizes it.  Conversely, every process of communication between human
beings -- or between any kind of 'intelligent' apparatus or structure, whether
mechanical or biological -- presupposes a system of signification as its own
specific condition.  In conclusion, it is possible to have a semiotics of
signification independent of a semiotics of communication, but it is
impossible to establish a semiotics of communication independent of a
semiotics of signification.
I learned a great deal from Umberto Eco, to whom I dedicated chapter 10,
"Umberto Eco and the process of re-interpretation and re-enchantment of
economic science" (pp. 175-217), of "Valore e valutazioni. La scienza
dell'economia o l'economia della scienza" (FrancoAngeli, Milan, 1997).  In the
same book are also found:
- chapter 15, "Economic-evaluative semiotics" (pp. 327-361), which is situated
within the framework of a global theory of all systems of signification and
processes of communication;
- subsection 5.3.3, "The genetic psychology of Jean Piaget and the
neurobiology of Humberto Maturana and Francisco Varela: a new experimental
epistemology of quality and uniqueness" (pp. 120-130).
I apologize to everyone if I have tired you, or if my writing in Italian once
again causes you some problems.  I think that the gift you give me is, with
regard to QUALITY and UNIQUENESS, much greater than the gift (and forgiveness)
I ask of you.  Thank you.
With warm regards,
Francesco


2018-02-07 23:02 GMT+01:00 Terrence W. DEACON <dea...@berkeley.edu>:
Dear FISers,

In previous posts I have disparaged using language as the base model for 
building a general theory of information.
Though I realize that this may seem almost heretical, it is not a claim that 
all those who use linguistic analogies are wrong, only that it can be causally 
misleading.
I came to this view decades back in my research into the neurology and 
evolution of the human language capacity.
And it became an organizing theme in my 1997 book The Symbolic Species.
Early in the book I describe what I (and now other evolutionary biologists) 
have come to refer to as a "porcupine fallacy" in evolutionary thinking.
Though I use it to critique a misleading evolutionary taxonomizing tendency, I 
think it also applies to biosemiotic and information theoretic thinking as well.
So to exemplify my reasoning (with apologies for quoting myself) I append the 
following excerpt from the book.

"But there is a serious problem with using language as the model for analyzing 
other

species’ communication in hindsight. It leads us to treat every other form of 
communication as

exceptions to a rule based on the one most exceptional and divergent case. No 
analytic method

could be more perverse. Social communication has been around for as long as 
animals have

interacted and reproduced sexually. Vocal communication has been around at 
least as long as frogs

have croaked out their mating calls in the night air. Linguistic communication 
was an afterthought,

so to speak, a very recent and very idiosyncratic deviation from an ancient and 
well-established

mode of communicating. It cannot possibly provide an appropriate model against 
which to assess

other forms of communication. It is the rare exception, not the rule, and a 
quite anomalous

exception at that. It is a bit like categorizing birds’ wings with respect to 
the extent they possess or

lack the characteristics of penguins’ wings, or like analyzing the types of 
hair

[Fis] The unification of the theories of information based on the category theory

2018-02-07 Thread Sungchul Ji
Hi  FISers,


On 10/8/2017, Terry wrote:


" So basically, I am advocating an effort to broaden our discussions and 
recognize that the term information applies in diverse ways to many different 
contexts. And because of this it is important to indicate the framing, whether 
physical, formal, biological, phenomenological, linguistic, etc.

. . . . . . The classic syntax-semantics-pragmatics distinction introduced by 
Charles Morris has often been cited in this respect, though it too is in my 
opinion too limited to the linguistic paradigm, and may be misleading when 
applied more broadly. I have suggested a parallel, less linguistic (and nested 
in Stan's subsumption sense) way of making the division: i.e. into intrinsic, 
referential, and normative analyses/properties of information."

I agree with Terry's concern about the often overused linguistic metaphor in 
defining "information".  Although the linguistic metaphor has its limitations 
(as all metaphors do), it nevertheless offers a unique advantage as well, for 
example, its well-established categories of functions (see the last column in 
Table 1.)

The main purpose of this post is to suggest that all the varied theories of 
information discussed on this list may be viewed as belonging to the same 
category of ITR (Irreducible Triadic Relation) diagrammatically represented as 
the 3-node closed network in the first column of Table 1.

Table 1.  The postulated universality of ITR (Irreducible Triadic Relation) as 
manifested in information theory, semiotics, cell language theory, and 
linguistics.

Category Theory

              f            g
     A ----------> B ----------> C
     |                           ^
     |                           |
     |___________________________|
                   h

ITR (Irreducible Triadic Relation)

Deacon's theory of information:  A = Intrinsic information;
   B = Referential information;  C = Normative information;  f = ?;  g = ?;
   h = ?;  Scale = Micro-Macro?

Shannon's theory of information:  A = Source;  B = Message;  C = Receiver;
   f = Encoding;  g = Decoding;  h = Information flow;  Scale = Macro

Peirce's theory of signs:  A = Object;  B = Sign;  C = Interpretant;
   f = Sign production;  g = Sign interpretation;  h = Information flow;
   Scale = Macro

Cell language theory:  A = Nucleotides*/Amino acids;  B = Proteins;
   C = Metabolomes (totality of cell metabolism);  f = Physical laws;
   g = Evolutionary selection;  h = Inheritance;  Scale = Micro

Human language (Function):  A = Letters (building blocks);
   B = Words (denotation);  C = Systems of words (decision making & reasoning);
   f = Second articulation;  g = First and Third articulation;
   h = Grounding/Habit;  Scale = Macro


*There may be more than one genetic alphabet of 4 nucleotides.  According to 
the "multiple genetic alphabet hypothesis", there are n genetic alphabets, the 
n-th consisting of 4^n letters, each of which in turn consists of n 
nucleotides.  In this view, the classical genetic alphabet is just one example 
of the n alphabets, i.e., the one with n = 1.  When n = 3, for example, we have 
the so-called 3rd-order genetic alphabet with 4^3 = 64 letters, each consisting 
of 3 nucleotides, resulting in the familiar codon table.  Thus, the 64 genetic 
codons are not words, as widely thought (including by myself until recently), 
but letters!  It then follows that proteins are words and metabolic pathways 
are sentences.  Finally, the transient networks of metabolic pathways (referred 
to as "hyperstructures" by V. Norris in 1999 and as "hypermetabolic pathways" 
by me more recently) correspond to texts, which are essential for representing 
argument/reasoning/computing.  What is most exciting is the recent discovery in 
my lab at Rutgers that the so-called "Planck-Shannon plots" of mRNA levels in 
living cells can identify function-dependent "hypermetabolic pathways" 
underlying breast cancer before and after drug treatment (manuscript under 
review).
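The combinatorics of the hypothesis can be made concrete with a minimal Python 
sketch (an illustration of my own; the function name and the values of n chosen 
are assumptions for illustration only, not anything from Petoukhov's papers):

from itertools import product

BASES = "ACGU"  # the classical genetic alphabet (n = 1)

def nth_order_alphabet(n):
    """Return the hypothetical n-th order genetic alphabet:
    all 4**n 'letters', each an n-nucleotide string (n-plet)."""
    return ["".join(p) for p in product(BASES, repeat=n)]

for n in (1, 2, 3):
    letters = nth_order_alphabet(n)
    print(n, len(letters), letters[:5], "...")
# n = 3 yields the 64 triplets of the familiar codon table.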

Any comments, questions, or suggestions would be welcome.

Sung

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Summing up: New Year Lecture

2018-01-24 Thread Sungchul Ji
Hi John, Pedro, and FISers


(1) I agree with John that there must exist a set of principles, laws, or 
concepts that apply universally, from molecules to cells to human brains to the 
cosmos.  But the million-dollar question is: what are these principles, laws, 
and concepts?


(2)  I disagree that "chemiosmosis" of P. Mitchell is one of the principles 
underlying life's physiology.  I will delineate my reasons for objecting to the 
concept of chemiosmosis as a fundamental mechanism of energy transduction in 
living cells in a later post.  I recently discussed this topic in Section 3.3.3 
in [1], entitled "Deconstructing the Chemiosmotic Model".


(3) I disagree that "negentropy", also called "negative entropy", is a 
fundamental concept in biology or physics.  I critiqued this concept in Section 
3.3.3 in [2].  To make a long story short, there are two meanings to the term 
"negentropy", one violating the Third Law of thermodynamics and the other not 
(see Table 1 below).


Table 1.  The dual meanings of the term “negentropy”

  Negentropy read as:             Negative Entropy (-S)    Negative Entropy Change (-ΔS)
  Third Law of thermodynamics:    is violated              is not violated



(4)  The Second Law of thermodynamics, when applied to an isolated system 
(i.e., a system that exchanges neither energy nor matter with its environment), 
states that the entropy of the system increases with time, or that the 
thermodynamic driving force of an isolated system is the increase in entropy:

    ΔS = S_final - S_initial > 0.                                      (I)

But many scientists do not realize that Inequality (I) holds only for isolated 
systems, not for non-isolated systems such as living organisms (which are open 
systems, i.e., systems that exchange both energy and matter with their 
environment) or physical systems such as refrigerators, which are closed (i.e., 
exchange energy but not matter with their environment).


(5) For biological systems under constant pressure (P) and temperature (T), the 
driving force behind all spontaneous physicochemical changes occurring in them 
(e.g., respiration, morphogenesis) is a decrease in Gibbs free energy (ΔG), 
which is a function of both energy and entropy:

    ΔG = ΔE + PΔV - TΔS                                                (II)

As you can see here, ΔG can be negative (thus driving all spontaneous living 
processes) without any negative entropy change (or negentropy), since ΔG < 0 as 
long as (ΔE + PΔV) < TΔS, regardless of whether ΔS is negative or positive.  
This demonstrates that negentropy cannot serve as a fundamental force for 
life's physiology.
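A minimal numerical sketch of this sign logic (an illustration only; the 
numbers are made-up values in arbitrary energy units, chosen merely to show 
that ΔG can be negative with either sign of ΔS):

def delta_G(dE, P, dV, dS, T):
    """Gibbs free energy change at constant T and P: dG = dE + P*dV - T*dS."""
    return dE + P * dV - T * dS

# Case 1: entropy of the system increases (dS > 0)
print(delta_G(dE=-10.0, P=1.0, dV=0.0, dS=+0.01, T=300.0))  # -13.0 < 0, spontaneous

# Case 2: entropy of the system decreases (dS < 0, "negentropy"),
# yet dG is still negative because the energy term dominates.
print(delta_G(dE=-10.0, P=1.0, dV=0.0, dS=-0.01, T=300.0))  # -7.0 < 0, spontaneous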


(6)








References:

[1] Ji, S. (2018).  "The Cell Language Theory: Connecting Mind and Matter", 
World Scientific Publishing, New Jersey.

[2] Ji, S. (2012). The Third Law of Thermodynamics and “Schroedinger’s 
Paradox”.  In: Molecular Theory of the Living Cell: Concepts, Molecular 
Mechanisms, and Biomedical Applications.  Springer, New York.  pp. 12-15.  PDF 
at http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf


From: Fis  on behalf of JOHN TORDAY 

Sent: Wednesday, January 24, 2018 9:33 AM
To: fis@listas.unizar.es
Subject: Re: [Fis] Summing up: New Year Lecture

Dear FIS colleagues, Pedro has pointed out some rookie errors in my post. You 
can find my paper "From cholesterol to consciousness" at 
https://www.ncbi.nlm.nih.gov/pubmed/28830682.
 Hopefully you have access to the paper without having to buy it. If you don't, 
please email me at jtor...@ucla.edu and I will send you a copy. As for 
addressing consciousness at the cellular/molecular level, I understand that the 
mental health professionals have a problem with consciousness beyond the 
brain/mind. But I consider that anthropocentric. Just like every o

Re: [Fis] Response to Sungchul. Generative Logic

2018-01-14 Thread Sungchul Ji
Hi Soren,


Which comment is for me?


Also, I want to clarify the following:


(1) 'Semiotics' is the name given to the study of signs generally and existed 
since long before Peirce's time (1839-1914).

(2) If we represent 'semiotics' as a large circle, it will contain many small 
sub-circles representing various theories about sign processes, including 
Peirce's own, yours, mine, and many others', each sub-circles contributing to 
the complete description of the large circle.

(3) In this Venn diagrammatic sense, 'neo-semiotics' is a sub-circle belonging 
to the large circle of Semiotics that should have some overlap with the 
Peircean semiotics since it is an extension of the latter.  Further, 
neo-semiotics has many new features not contained in the Peircean semiotics 
(e.g., molecular signs and their mechanisms of action driven by free energy 
dissipation, the essential thermodynamic requirement for semiosis, and the 
relation between micro- and macrosemiotics, etc.) and hence cannot be 
completely contained within the sub-circle of the Peircean semiotics.


All the best.


Sung



From: Søren Brier 
Sent: Sunday, January 14, 2018 10:32 AM
To: Loet Leydesdorff; Joseph Brenner; Terrence W. DEACON; Alex Hankey; Fis,
Cc: Emanuel Diamant; Sungchul Ji
Subject: RE: [Fis] Response to Sungchul. Generative Logic


Dear Pedro



There seems to be some malfunction in the system. Three comments – the last one 
to Sung – have not appeared on the list. Could you investigate?

 Best

  Søren Brier
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] I salute to Sungchul

2018-01-13 Thread Sungchul Ji
   [5] Ji, S. (1997). Isomorphism between cell and human languages: molecular 
biological, bioinformatic and linguistic implications.  BioSystems 44:17-39.  
PDF at http://www.conformon.net/wp-content/uploads/2012/05/Isomorphism1.pdf

[6] Ji, S. (2017).  The Cell Language Theory: Connecting Mind and Matter.  
World Scientific, New Jersey.  Chapter 5.









From: Alex Hankey 
Sent: Saturday, January 13, 2018 12:24 AM
To: Sungchul Ji
Cc: Emanuel Diamant; fis@listas.unizar.es
Subject: Re: [Fis] I salute to Sungchul

And what about the Kinds of Information that you cannot put in a data set?
The information that makes you turn your head and meet the gaze of someone 
staring at you.
No one could do that, which we humans and all animals do constantly,
unless we had received such information at a subliminal level in the brain.
We all have that capacity, it is vital for survival in the wild. All animals do 
it.
The 'Sense of Being Stared At' is a common experience for most animals,
how far down the tree of life no one yet knows.

Whatever triggers it is definitely 'A Difference that Makes a Difference',
so fits in your definition of 'Meaningful Information' - it has to!
BUT IT CANNOT BE DIGITAL INFORMATION.
Please Face Up to This Fact.

All best wishes,

Alex


On 13 January 2018 at 07:30, Sungchul Ji 
mailto:s...@pharmacy.rutgers.edu>> wrote:

Hi Emanuel and FISers,


Thank you, Emanuel, for your generous remarks.  It is heartening to know that 
our ideas converge, although we carried out our research independently of each 
other, a clear example of consilience.


(1)  I like and agree with the Kolmogorov quote you cited in [1]:


"Information is a linguistic description of structures in a given data set."

It seems to me that there are 4 key concepts embedded in the above quote, which 
we may view as the definition of what may be called the "Kolmogorov 
information" or the "Kolmogorov-Bateson information" for the convenience of 
reference:

i)   data set (e.g., ACAGTCAACGGTCCAA)
ii)  linguistic description (e.g., Threonine, Valine, Asparagine, Glycine)
iii) structure (e.g., 16 mononucleotides, 8 dinucleotides, 5 trinucleotides plus 1)
iv)  mathematical description (e.g., tensor product of two 2x2 matrices of 4 
nucleotides) [2, 3].

The first three elements are obvious, but the 4th is not so obvious but 
justified in view of the recent work of Petoukhov [2, 3].

(2) Based on these ideas, I have constructed Table 1 below of the various names 
applied to the two kinds of information which I described as I(-) and I(+) in 
my previous post.







Table 1.  The arbitrariness of the signs referring to ‘information’. It doesn’t 
matter what you call it, as long as your chosen label refers to the right 
reality, thing, process, mechanisms, etc.

  1   Type I Information                   Type II information
  2   Physical Information                 Semantic information
  3   Shannon information                  Kolmogorov information, or
                                           Kolmogorov-Bateson information
  4   ‘Meaningless’ information            ‘Meaningful’ information
  5   I(-) information, or simply I(-)     I(+) information, or simply I(+)
  6   Quantitative information             Qualitative information
  7   Mathematical information             Linguistic information (see Statement (1))
  8   Formal information                   Phenomenological information
  9   Interpretant-less sign [4]           Triadic sign [4]




(3)  One practical application of the dual theory of information under 
discussion is in deducing the structure of cell language, or the structure of 
the linguistics of DNA, in a much more rigorous manner than was possible in 
1997 [5].

   It is the common practice in biology to use the terms "letters", "words", 
"sentences", and "texts" without any rigorous definitions.  The general rule is 
to follow the rules of concatenations used in linguistics literally and say that

i) just as 26 letters in the English alphabet are combined to form words (the 
process being called the second articulation [5]), so the 4 letters of the 
genetic alphabets, A, C, G and T/U,  combine in triplets to form genetic 
codons.  S

Re: [Fis] I salute to Sungchul

2018-01-12 Thread Sungchul Ji
Hi Emanuel and FISers,


Thank you, Emanuel, for your generous remarks.  It is heartening to know that 
our ideas converge, although we carried out our research independently of each 
other, a clear example of consilience.


(1)  I like and agree with the Kolmogorov quote you cited in [1]:


"Information is a linguistic description of structures in a given data set."

It seems to me that there are 4 key concepts embedded in the above quote, which 
we may view as the definition of what may be called the "Kolmogorov 
information" or the "Kolmogorov-Bateson information" for the convenience of 
reference:

i)   data set (e.g., ACAGTCAACGGTCCAA)
ii)  linguistic description (e.g., Threonine, Valine, Asparagine, Glycine)
iii) structure (e.g., 16 mononucleotides, 8 dinucleotides, 5 trinucleotides plus 1)
iv)  mathematical description (e.g., tensor product of two 2x2 matrices of 4 
nucleotides) [2, 3].

The first three elements are obvious, but the 4th is not so obvious but 
justified in view of the recent work of Petoukhov [2, 3].

(2) Based on these ideas, I have constructed Table 1 below of the various names 
applied to the two kinds of information which I described as I(-) and I(+) in 
my previous post.




Table 1.  The arbitrariness of the signs referring to ‘information’. It doesn’t 
matter what you call it, as long as your chosen label refers to the right 
reality, thing, process, mechanisms, etc.

  1   Type I Information                   Type II information
  2   Physical Information                 Semantic information
  3   Shannon information                  Kolmogorov information, or
                                           Kolmogorov-Bateson information
  4   ‘Meaningless’ information            ‘Meaningful’ information
  5   I(-) information, or simply I(-)     I(+) information, or simply I(+)
  6   Quantitative information             Qualitative information
  7   Mathematical information             Linguistic information (see Statement (1))
  8   Formal information                   Phenomenological information
  9   Interpretant-less sign [4]           Triadic sign [4]


(3)  One practical application of the dual theory of information under 
discussion is in deducing the structure of cell language, or the structure of 
the linguistics of DNA, in a much more rigorous manner than was possible in 
1997 [5].
   It is the common practice in biology to use the terms "letters", "words", 
"sentences", and "texts" without any rigorous definitions.  The general rule is 
to follow the rules of concatenations used in linguistics literally and say that

i) just as 26 letters in the English alphabet are combined to form words (the 
process being called the second articulation [5]), so the 4 letters of the 
genetic alphabets, A, C, G and T/U,  combine in triplets to form genetic 
codons.  Similarly, just as words form sentences and sentences form texts by 
the same concatenation procedure (or tensor multiplication, mathematically 
speaking, i.e., by linearly arranging words and sentences, respectively; see 
the second column in Table 2), so the 64 nucleotide triplets combine to form 
proteins, and proteins combine to form metabolic pathways, by continuing the 
concatenation process, or the tensor multiplication of matrices of larger and 
larger sizes (see the fourth column, which is based on the physical theory of 
information, i.e., without any involvement of semantics or the first 
articulation).

ii)   In contrast to the fourth column just described, we can justify 
alternative structural assignments based on the semantic theory of information, 
as shown in the fifth column of Table 2.  Here the letters of the cell language 
alphabet are not always mononucleotides but are thought to be n-nucleotides, 
such as dinucleotides (when n = 2), trinucleotides (when n = 3), 
tetranucleotides (when n = 4), pentanucleotides (when n = 5), etc.  That is, 
unlike in human language, where the letters of an alphabet usually consist of 
one symbol, e.g., A, B, C, D, E, . . . , I am claiming that in cell language 
the letters can be mononucleotides (i.e., A, G, C, T/U), dinucleotides (i.e., 
AG, AC, . . . .), trinucleotides (i.e., ACT, GTA, . . . ), tetranucleotides 
(i.e., ACTG, CCGT, . . . .), pentanucleotides (i.e., ACCTG, TCGAT, . . .), and 
so on, up to n-nucleotides (also called n-plets [2, 3]), where n is an unknown 
number whose upper limit is not yet known (at least to me).  If this conjecture 
turns out to be true, then the size of the cell language alphabet can be much 
larger (10^3 - 10^9 ?) than the size of a typical human linguistic alphabet, 
which is usually less than 10^2, probably due to the limitation of the memory 
capacity of the human brain.

(iii) From linguistics, we learn that there are at least 4 levels of 
organization, each level characterized by a unique function (see the second 
column).  Without presenting any detailed argument, I just wish to suggest that 
the linguistic structures deduced based on the semantic information theory 
(i.e., the fifth column) agree with the human linguistic structures (i.e., the 
second column) better than does the linguistic structures bas

Re: [Fis] New Year Lecture

2018-01-11 Thread Sungchul Ji
Hi Pedro, John and other FISers,


(1)  Thank you John for the succinct summary of your cell-based evolutionary 
theory.  As I indicated offline, I too proposed a cell-based evolutionary 
theory in 2012 [1] and compared it with the gene-centered evolutionary theory 
of Zeldovich and Shankhnovich (see Table 14.10 in [1]).


(2) I agree with Pedro that

". . . ..  essential informational ideas are missing too, and this absence of 
the informational perspective in the ongoing evo discussions is not a good 
thing. . . . "


I often wonder if this situation has arisen in biology because biologists 
blindly apply to their problems the information theory as introduced by Shannon 
almost 7 decades ago in the context of communication engineering without due 
attention paid to the fact that  the Shannon-type information theory is not 
designed to handle the "meaning" or semantics of messages but only the AMOUNT 
of the information they carry.  If we agree that there are three essential 
aspects to information, i.e., amount (e.g., my USB stores 3 Megabytes of 
information), meaning (e.g., the nucleotide triplet, ACG, encodes threonine),  
and value (e.g., the same message, 'Yes', can have different monetary values, 
depending on the context), we can readily see that the kind of information 
theory most useful for biologists is not (only) the Shannon-type but (also) 
whatever type that can handle the semantics and pragmatics of information.


(3) One way to avoid the potential confusions in applying information theory to 
biology may be to recognize two distinct types of information which, for the 
lack of better terms, may be referred to as the "meaningless information" or 
I(-)  and "meaningful information" or I(+), and what Pedro regarded as the 
missing "essential informational ideas" above may be identified with I(+) (?)


(4)  There may be many forms of the I(+) theories to emerge in the field of 
"new informatics" in the coming decades.  Based on my research results obtained 
over the past two decades, I am emboldened to suggest that "linguistics" can be 
viewed as an example of the I(+) theory. The term "linguistics" was once 
fashionable in Western philosophy and humanities 
(https://en.wikipedia.org/wiki/Linguistic_turn) in the form of "linguistic 
turn" but apparently became outmoded (for some unknown reason to me, a 
non-philosopher), but I am one of the many (including Chargaff who discovered 
his two parity rules of DNA sequences; 
https://en.wikipedia.org/wiki/Chargaff%27s_rules) who believes that linguistics 
provides a valuable tool for elucidating the workings of living structures and 
processes [2, 3].  In fact, we may refer to the emerging trend in the early 21st 
century that explore the basic relations between linguistics and biology as the 
"Linguistic Return", in analogy to the "Linguistic Turn" referring to the  
"major development in Western philosophy during the early 20th century, the 
most important characteristic of which is the focusing of philosophy and the 
other humanities primarily on the relationship between philosophy and 
language." ((https://en.wikipedia.org/wiki/Linguistic_turn)

(5)  So, linguistics played an important role in philosophy in the early 20th 
century and may play a similarly important role in biology in the coming 
decades of the 21st century.  What about physics?  Does physics need 
linguistics to solve their basic problems ?   If not linguistics, perhaps 
semiotics, the study of signs?  The latter possibility was suggested by Brian 
Josephson in his lecture

"Biological Organization as the True Foundation of Reality"


 given at the 66th Lindau Nobel Laureate Meeting held in Lindau, Germany, 
stating that

“Semiotics will eventually overtake quantum mechanics in the same way as 
quantum mechanics overtook classical physics.”

I referred to this statement as the "Josephson conjecture" in [3].  When I 
visited him in Cambridge last summer to discuss this statement, he did not 
object to his name being used in this manner.


(6)  If the concepts of the "Linguistic Return" in biology and the Josephson 
conjecture  in physics prove to be correct in the coming decades and centuries, 
it may be possible to conclude that philosophy, biology, and physics are 
finally united/integrated in the framework of semiotics viewed as a generalized 
linguistics.


All the best.


Sung








  [1] Ji, S. (2012).  The Zeldovich-Shakhnovich and MTLC Models of 
Evolution: From Sequences to Species.  In: Molecular Theory of the Living Cell: 
Concepts, Molecular Mechanisms, and Biomedical Applications.  Springer, New 
York.  Pp. 509-519. PDF at http://www.conformon.net/model-of-evolution/
   [2] Ji, S. (2012).  The Isomorphism between Cell and Human Languages: The 
Cell Language Theory. In: Molecular Theory of the Living Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications.  Spring

Re: [Fis] Social information, Sociotype

2017-12-22 Thread Sungchul Ji
Hi Emanuel, Pedro & FISers,


Emanuel is raising an important point.  It is clear that the duality of genetic 
and epigenetic information is fundamental.  But I think the triadicity of 
genotype, phenotype, and sociotype is just as important and fundamental, if we 
interpret the triadicity to mean that three cannot be reduced to two or one, 
just as the duality means that two cannot be reduced to one.  The two 
relations, the former irreducibly dyadic and the latter irreducibly triadic, 
may not be mutually exclusive or contradictory (as they may appear on the 
surface) but may reflect different aspects of the information of living systems 
-- one aspect is irreducibly dyadic and the other is irreducibly triadic, 
depending on how the phenomenon of life is viewed.  We may represent this idea 
diagrammatically as in Figure 1 below, which is an elaboration and extension of 
the ITR (Irreducible Triadic Relation) diagram that I used in my previous post:






__

   |___|

   | k   | i
  |   |Epigenetic Information flows

   |  | 
   |   |||

   V  f  V g
|   |   i, j, & k

   Genotype  >  Phenotype  -> Sociotype

  |  ^  |   
^

  |  |  j | 
 |

  |  |__|  |
  ||Genetic 
Information flows

  h 
||


f, g, & h




Figure 1.  The duality of the genetic and epigenetic information flows in the 
genotype-phenotype-sociotype triad underlying living systems.



Let me know if you have any questions or suggestions.

Sung


From: Fis  on behalf of Emanuel Diamant 

Sent: Tuesday, December 19, 2017 3:25 PM
To: 'Pedro C. Marijuan'
Cc: fis@listas.unizar.es
Subject: [Fis] Social information, Sociotype




Dear Pedro, Dear Fises,



I apologize, as usual, for intervening in your respected discussion. But in my 
humble understanding, the genotype-phenotype-sociotype triad is simply a list 
of behavioral forms (or types). Evidently, they are all endorsed and inspired 
by the information that is at the disposal of every living being. But from the 
information point of view (and we are busy with the information-essence quest), 
only two types of information are involved in producing a living being's 
behavior: genetic information and epigenetic information. That is: vertically 
exchanged inheritance information and horizontally exchanged experience 
information. These two types of information are responsible for the behavior of 
living beings at every level of biological organization known to us: single 
cell - genotype, cell assembly or organism - phenotype, groups of organisms or 
societies - sociotype.

Again, I apologize for invading your discussion, but we are busy with 
information, aren’t we?



Best regards, Emanuel.


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Comes the Revolution. The Real

2017-12-16 Thread Sungchul Ji
Hi Joseph,


Sorry for my mis-spelling your first name in my previous email.


Sung



From: Fis  on behalf of Joseph Brenner 

Sent: Saturday, December 16, 2017 11:56 AM
To: 'Søren Brier'; 'fis'
Subject: Re: [Fis] Comes the Revolution. The Real


Dear Søren,



Thank you for a positive and constructive note and question. Although I 
maintain my critique of Peirce’s tychism and synechism and his concepts of and 
manipulations of signs and diagrams, I have always seen value in many of his 
intuitions. I would be glad to consider him a ‘humanist with a semiotic 
worldview’. It takes all kinds . . .



I think for participants in this list to say what they mean by reality, exactly 
for, as you put it, a discussion of the ontology and science behind various 
informational paradigms, would be very useful. Pedro, what do you think? For me 
reality is change and stability, being and becoming, appearance and, 
contradictorially, the reality behind appearance. This is why standard logic 
doesn’t work.



Best Season’s Greetings,



Joseph



P.S. Perhaps a typo, but what is the sense of ‘treading’ in ‘treading 
processual concept’?









From: Søren Brier [mailto:sbr@cbs.dk]
Sent: samedi, 16 décembre 2017 13:28
To: Joseph Brenner; fis
Subject: RE: [Fis] Comes the Revolution



Dear Joseph



This is very Peircean of you, as “The challenge is to reconcile our roles as 
informational organisms and agents within nature and as stewards of nature.” is 
at the centre of Peirce’s thinking, except that he uses the treading processual 
concept of sign, instead of information, as his basic concept. I know that many 
call Peirce an objective idealist, although it is a form of realism; I am not 
sure that this concept covers his combination of Tychism and synechism with a 
semiotic worldview. I think Peirce’s view is unique. But your mail does put the 
focus on the importance of discussing the ontology behind the various 
informational paradigms.

What do we mean when we use the term real, for instance about Lupasco’s 
physical – biological – contradictorial information? As I understand it, the 
term has been pretty important for your view.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: 16. december 2017 10:15
To: fis 
Subject: [Fis] Comes the Revolution



Dear Pedro, Dear FISers,



I regret that I have difficulty in relating to the current FIS discussion, but 
that is my problem. I see little progress since the appearance of  Lupasco’s 
physical – biological – contradictorial information; Kauffman, Logan’s biotic 
and Ulanowicz’ apophatic information; Deacon’s Shannon – Boltzmann – Darwin 
information; and Wu’s revolution. Sungchul’s intuition of an “irreversible 
triadic relation” reflects the power of triads as cognitive attractors, but 
discussion is blocked by his use of the word ‘irreversible’, required by the 
underlying idealist Peircean structure of his argument.



What I would like to see is the foundations of information being discussed in 
relation to the real problems of society, beyond questionnaires. Some of these 
led yesterday to a prohibition of the use of seven words including foetus, 
diversity and science-based from certain U. S. Government documents. I think we 
need to have in the forefront of our minds the statement made by Floridi in his 
2010 book, Information. A Very Short Introduction (which all of you have read, 
of course): “The challenge is to reconcile our roles as informational organisms 
and agents within nature and as stewards of nature.”



I believe that such a perspective, placed as a criterion for selection of 
pertinent concepts, would make our discussions a lot deeper and more relevant.



Thank you and best wishes,



Joesph






___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Comes the Revolution. The Real

2017-12-16 Thread Sungchul Ji
Hi Joeph,


You wrote on December 16 as follows:


" Sungchul’s intuition of an “irreversible triadic relation” reflects the power 
of triads as cognitive attractors, but discussion is blocked by his use of the 
word ‘irreversible’, required by the underlying idealist Peircean structure of 
his argument."


Please correct "irreversible" to "irreducible".  This may have caused some 
confusion in your understanding of my argument and Peirce's.  The former is 
primarily a thermodynamic concept whereas the latter is a logical one, in my 
opinion.


Sung




From: Fis  on behalf of Joseph Brenner 

Sent: Saturday, December 16, 2017 11:56 AM
To: 'Søren Brier'; 'fis'
Subject: Re: [Fis] Comes the Revolution. The Real


Dear Søren,



Thank you for a positive and constructive note and question. Although I 
maintain my critique of Peirce’s tychism and synechism and his concepts of and 
manipulations of signs and diagrams, I have always seen value in many of his 
intuitions. I would be glad to consider him a ‘humanist with a semiotic 
worldview’. It takes all kinds . . .



I think for participants in this list to say what they mean by reality, exactly 
for, as you put it, a discussion of the ontology and science behind various 
informational paradigms, would be very useful. Pedro, what do you think? For me 
reality is change and stability, being and becoming, appearance and, 
contradictorially, the reality behind appearance. This is why standard logic 
doesn’t work.



Best Season’s Greetings,



Joseph



P.S. Perhaps a typo, but what is the sense of ‘treading’ in ‘treading 
processual concept’?









From: Søren Brier [mailto:sbr@cbs.dk]
Sent: samedi, 16 décembre 2017 13:28
To: Joseph Brenner; fis
Subject: RE: [Fis] Comes the Revolution



Dear Joseph



This is very Peircean of you, as “The challenge is to reconcile our roles as 
informational organisms and agents within nature and as stewards of nature.” is 
at the centre of Peirce’s thinking, except that he uses the treading processual 
concept of sign, instead of information, as his basic concept. I know that many 
call Peirce an objective idealist, although it is a form of realism; I am not 
sure that this concept covers his combination of Tychism and synechism with a 
semiotic worldview. I think Peirce’s view is unique. But your mail does put the 
focus on the importance of discussing the ontology behind the various 
informational paradigms.

What do we mean when we use the term real, for instance about Lupasco’s 
physical – biological – contradictorial information? As I understand it, the 
term has been pretty important for your view.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: 16. december 2017 10:15
To: fis 
Subject: [Fis] Comes the Revolution



Dear Pedro, Dear FISers,



I regret that I have difficulty in relating to the current FIS discussion, but 
that is my problem. I see little progress since the appearance of  Lupasco’s 
physical – biological – contradictorial information; Kauffman, Logan’s biotic 
and Ulanowicz’ apophatic information; Deacon’s Shannon – Boltzmann – Darwin 
information; and Wu’s revolution. Sungchul’s intuition of an “irreversible 
triadic relation” reflects the power of triads as cognitive attractors, but 
discussion is blocked by his use of the word ‘irreversible’, required by the 
underlying idealist Peircean structure of his argument.



What I would like to see is the foundations of information being discussed in 
relation to the real problems of society, beyond questionnaires. Some of these 
led yesterday to a prohibition of the use of seven words including foetus, 
diversity and science-based from certain U. S. Government documents. I think we 
need to have in the forefront of our minds the statement made by Floridi in his 
2010 book, Information. A Very Short Introduction (which all of you have read, 
of course): “The challenge is to reconcile our roles as informational organisms 
and agents within nature and as stewards of nature.”



I believe that such a perspective, placed as a criterion for selection of 
pertinent concepts, would make our discussions a lot deeper and more relevant.



Thank you and best wishes,



Joesph




[Fis] Decoding the Cell Language: The role of PDE and its derivative, the Planck-Shannon plot, in identifying MOLECULAR TEXTS

2017-12-06 Thread Sungchul Ji
Hi FISers,


(1)  In 1997 [9], I defined the cell language as follows:


"Cell language is a self-organizing system of molecules, some of which

encode, act as signs for, or trigger, gene-directed cell processes."


So defined, the cell language shares many qualitative similarities, or 
principles, with the human language, as summarized in Tables 2 and 6-3 in my 
11/27/2017 post to this list.  In contrast, Table 3 in the same post provides 
quantitative similarities between the two languages, since

(i) PDE, y = (A/(x + B)^5)/(exp(C/(x + B)) - 1), derived from physics, and MAL, 
y = (A x^-B)/exp(Cx), derived from linguistics [15], have a similar 
mathematical form in that both are products of a power function and an 
exponential function, and

(ii) PDE and MAL are equivalent as far as their ability to fit long-tailed 
histograms, regardless of whether the histograms came from physics or 
linguistics.
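For concreteness, the two functions can be written out as a minimal Python 
sketch (an illustration only; the parameter values used below are arbitrary 
assumptions, not fitted values from the studies cited):

import math

def pde(x, A, B, C):
    """PDE as written above: y = (A/(x+B)^5) / (exp(C/(x+B)) - 1)."""
    return (A / (x + B) ** 5) / (math.exp(C / (x + B)) - 1.0)

def mal(x, A, B, C):
    """MAL as written above: y = (A * x^-B) / exp(C * x)."""
    return (A * x ** (-B)) / math.exp(C * x)

# Both are products of a power-law factor and an exponential factor:
print(pde(1.0, A=1.0, B=0.5, C=2.0), mal(1.0, A=1.0, B=0.5, C=2.0))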


These findings strongly indicate that there are structural features of cell and 
human languages that are similar in terms of their functions as inferred in 
Table 4.  Please note that the first two terms in the following are well 
established in linguistics and the third term was introduced in the cell 
language theory in 2012 [6]:



1st articulation = words forming sentences

2nd articulation = letters forming words
3rd articulation = sentences forming texts.



Table 4.  Isomorphism between cell and human languages deduced from a 
qualitative comparison between linguistics and cell biology [1, 2, 3, 4].

  Human Language   Cell Language                              Function
  Alphabet         A, C, G, T/U                               Elementary signs [5]
  Words            Gene/mRNA/protein                          Denotation
  Sentences        Metabolic pathways                         Decision making
  Texts            Functionally related sets of metabolic     Logical reasoning
                   pathways (FRMPs)                           or computing




(2)  What is the Planck-Shannon plot (PSP) or the Planck-Shannon space (PSS)?


As pointed out earlier on this list, PDE has been found to fit almost all 
long-tailed histograms we have analyzed so far that have been reported in the 
fields of atomic physics, molecular biology, cell biology, neurophysiology, 
psychology, glottometrics (also called quantitative linguistics), econometrics, 
cosmology [7-9], and most recently social network science [10].  The deviation 
of the asymmetric PDE from a symmetric curve such as the Gaussian distribution 
function [4, Figure 8.7] can be used as a measure of non-randomness, order, or 
information encoded in PDE [11].  There are two ways of quantifying the 
information content of PDE:

    Planckian information of the first kind:    I_PF = log_2 [AUC(PDE)/AUC(GLE)]      (1)

    Planckian information of the second kind:   I_PS = - log_2 [(mu - mode)/sigma]    (2)

where AUC = Area Under the Curve; GLE = Gaussian-Like Equation whose rising 
portion closely approximates the rising portion of PDE; and mu and sigma are, 
respectively, the mean and the standard deviation of the data set that can be 
represented as a long-tailed histogram fitting PDE.  In addition, PDE allows us 
to calculate the associated Shannon entropy as

    H = - Sum_i (p_i log_2 p_i)                                                       (3)

where the sum runs over i from 1 to n, the number of data points, and p_i is 
the probability of the occurrence of the i-th data point.
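Eqs. (2) and (3) can be sketched in a few lines of Python (an illustration 
only; the histogram binning, the function names, and the toy lognormal sample 
are my own assumptions, not the procedure of the analyses described below):

import numpy as np

def planckian_information_2nd_kind(data, bins=30):
    """I_PS = -log2((mean - mode)/std), with the mode estimated
    as the midpoint of the tallest histogram bin."""
    counts, edges = np.histogram(data, bins=bins)
    peak = np.argmax(counts)
    mode = 0.5 * (edges[peak] + edges[peak + 1])
    return -np.log2((data.mean() - mode) / data.std())

def shannon_entropy(data, bins=30):
    """H = -sum p_i log2 p_i over the histogram bins."""
    counts, _ = np.histogram(data, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

toy = np.random.lognormal(mean=0.0, sigma=0.7, size=1000)  # a long-tailed toy sample
print(planckian_information_2nd_kind(toy), shannon_entropy(toy))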
   We have analyzed the mRNA level data of the arbitrarily selected 10 
metabolic pathways measured from human breast tissues using microarrays by 
Perou et al. [12].  These data sets all produced long-tailed histograms that 
fitted PDE, thus generating 10 pairs of the I_PS and H values.  (We found that 
I_PS values are more reproducible than I_PF.)  When these 10 pairs of numbers 
were plotted in the so-called "Planck-Shannon space", a linear correlation 
(called the Planck-Shannon plot) was found, with an R^2 value of 0.686 (see the 
upper panel of Figure 1).  Interestingly, when similarly sized 10 sets of mRNA 
levels were selected from the human transcriptome that have no known metabolic 
functions and plotted in the PSS, no correlation was found, the R^2 value being 
0.213, far less than 0.7, the minimum threshold for a significant correlation 
(see the lower panel of Figure 1).

[Figure 1 (image): Planck-Shannon plots (I_PS vs. H) of 10 metabolic pathways 
(upper panel, R^2 = 0.686) and of 10 gene sets with no known metabolic function 
(lower panel, R^2 = 0.213).]


 (3)  Until just recently (fall, 2017), there had been no method to identify 
FRMPs, although they were predicted to exist by the cell language theory.  It 
now seems that we have such a method in the form of the Planck-Shannon plots, 
as exemplified in Figure 1.  In other words, when two sets of metabolic 
pathways are chosen that have 30 or more mRNA molecules in each pathway (so 
that a decent histogram can be obtained), one set encoding functions while the 
other set has no metabolic functions, we see the difference in the correlation 
coefficients between their Planck-Shannon plots, indicating that the 
Planck-Shannon space (or the I_PS vs. H plot) is capable of distingu

Re: [Fis] Is there a boundary between genetic informatics and genetics?

2017-12-06 Thread Sungchul Ji
Hi Xueshan and FISers,


Thanks for your generous comments.


(1)  You are probably right that  "Genetics is an information science, the 
first and most fully developed information science." It seems to be more real.


(2)  In Sung’s statement, imitating human linguistics of letters, words, 
sentences, texts, he divided the substrate or the media that carry genetic 
information into the following categories:


          1                       2                      3
A, C, G, T or U --> genes/mRNA/proteins --> metabolic pathways --> functional networks
        ^                                                          of metabolic pathways
        |                                                                    |
        |____________________________________________________________________|
                                       4





. . .  Our bewilderment is: Is there a boundary between genetic informatics and 
genetics?


This is a million-dollar question, as they say.  There can be more than one 
answer to this fundamental question, depending on the perspective of the 
answerer.  My answer, based on the above scheme or network (modified from the 
original one by adding the backward U-shaped arrow and four numbers, all in 
black), would be as follows:


   (i) The network is the complementary union of two aspects -- the formal and 
the material.  The study of the formal aspect of the network may be identified 
as an example of "informatics" and the study of the material aspect as 
"genetics" (which can be divided into molecular or microscopic genetics and 
macroscopic or classical genetics).

   (ii) The network has 4 nodes and 4 arrows, each of these 8 items or any 
combinations of them  can be studied as a specialized discipline, including the 
study of all of the items simultaneously, as I attempted to do in my 11/27/2017 
post to this list. I regard such a comprehensive (and ambitious) discipline as 
a part of what I came to refer to as "gnergetics", or the study ("-tics") of 
information ("gn-") and energy ("erg-") around 1985 [1].   In contrast,  the 
work of Petoukhov is primarily concerned with the mathematical underpinning of 
the molecular genetic structures which he has identified with tensor products 
of matrices [2].
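As a toy illustration only (my own sketch; the node names are taken from the 
scheme above), the 4 nodes and 4 arrows can be written down as a plain edge 
list in Python:

# The 4-node, 4-arrow network of the scheme above, as a plain edge list.
NODES = ["A, C, G, T or U", "genes/mRNA/proteins", "metabolic pathways",
         "functional networks of metabolic pathways"]

ARROWS = [
    (1, NODES[0], NODES[1]),  # arrow 1
    (2, NODES[1], NODES[2]),  # arrow 2
    (3, NODES[2], NODES[3]),  # arrow 3
    (4, NODES[3], NODES[0]),  # arrow 4: the backward U-shaped arrow
]

# The 8 items (4 nodes + 4 arrows), each of which can be studied separately:
for item in NODES + ARROWS:
    print(item)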
   (iii) Based on the above considerations, my answer to the above question 
would be twofold:

(a) There is no clear boundary in principle between genetic informatics and 
genetics.

(b) It may be convenient to distinguish between molecular genetics and  
classical genetics, the former being more closely related to informatics and 
the latter to genetics.


All the best.


Sung




References:
   [1] Ji, S. (2012).  
Complementarity.
  In: Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms, and 
Biomedical Applications.  Springer, New York.  Section 2.3, pp. 24-50.  PDF at 
http://www.conformon.net/wp-content/uploads/2012/08/Excerpts_Chapters_2_complementarity_08192012.pdf
   [2]  See Ref. [12] in my 11/27/2017 post.




From: Fis  on behalf of Xueshan Yan 

Sent: Wednesday, December 6, 2017 10:28 AM
To: FIS Group
Subject: [Fis] Is there a boundary between genetic informatics and genetics?


Dear FIS Colleagues:

Last week, Sung and I discussed the problem of information in cell language and 
human language. Pedro gave his opinion too. I think the Sung’s work is very 
important to our information science study.

Biology is an informational science; this is the view of Leroy E. Hood, the 
Nobel prize winner of 2002. As to this argument, he didn't give a complete 
biological argument, only a genomics one. Reviewing the history of those 
disciplines that claimed to be members of information science in past years, we 
cannot wholly agree with them; for example, Bradley Efron, the former president 
of the American Statistical Association, thought: "Statistics is an information 
science, the first and most fully developed information science." But I 
believe, imitating Efron's statement: "Genetics is an information science, the 
first and most fully developed information science." It seems to be mo

Re: [Fis] some notes

2017-11-17 Thread Sungchul Ji
Hi FISers,


I find it convenient to define communication as an irreducibly triadic process 
(physical, chemical, biological, physiological, or mental).  I identify such a 
triadic process with the Peircean semiosis (or the sign process) often 
represented as the following diagram which is isomorphic with the commutative 
triangle of the category theory.  Thus, to me, communication is a category:


             f          g
        A ------> B ------> C
        |                   ^
        |                   |
        |___________________|
                  h


Figure 1.  A diagrammatic representation of semiosis, sign process, or 
communication.  The names of the nodes and edges can vary depending on the 
communication system under consideration, which can be chemical reaction 
systems, gene expression mechanisms, human communication using symbols, or 
computer systems using electrical signals.  If applied to the Shannon 
communication system, A = source, B = signals, C = receiver, f = encoding, g = 
decoding, and h = information transfer/flow.  When applied to human symbolic 
communication, A = object, B = sign, C = interpretant, f = sign production, g = 
interpretation, and h = information flow.


One usefulness of Figure 1 is its ability to distinguish between "interactions" 
(see Steps f and g) and "communication" (see Steps f, g and h); the former is 
dyadic and the latter triadic.
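To make the category-theoretic reading concrete, here is a toy Python sketch of 
the commutativity condition, h = g o f, that Figure 1 encodes (an illustration 
only; the state names and mappings are made up):

# Toy commutative triangle: A --f--> B --g--> C, with h: A --> C.
f = {"source_state": "signal"}        # encoding (A -> B)
g = {"signal": "receiver_state"}      # decoding (B -> C)

def compose(g, f):
    """Return the composite map g o f as a dict."""
    return {a: g[b] for a, b in f.items()}

h = compose(g, f)                     # information flow (A -> C)
# The triangle commutes: following f then g equals following h directly.
assert h["source_state"] == "receiver_state"
print(h)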


All the best.


Sung



From: Fis  on behalf of Loet Leydesdorff 

Sent: Friday, November 17, 2017 8:06 AM
To: Pedro C. Marijuan; fis
Subject: Re: [Fis] some notes

Dear Pedro and colleagues,

2. Eigenvectors of communication. Taking the motif from Loet, and continuing 
with the above, could we say that the life cycle itself establishes the 
eigenvectors of communication? It is intriguing that maintenance, persistence, 
self-propagation are the essential motives of communication for whatever life 
entities (from bacteria to ourselves). With the complexity increase there 
appear new, more sophisticated directions, but the basic ones probably remain 
intact. What could be these essential directions of communication?
I am not so convinced that there is an a priori relation between life and 
communication. Communication is not alive. Non-living systems (e.g., computers, 
robots) also communicate. Perhaps, it matters for the communication whether the 
communicators are living systems; but this needs to be specified.

Communication studies is not biology. Perhaps, there is a specific biological 
communication as Maturana claims: when molecules are exchanged, one can expect 
life. Can one have life without communication? It seems to me that one can have 
communication without life. Communication would then be the broader category 
and life a special case.

Best,
Loet



3. About logics in the pre-science, Joseph is quite right demanding that 
discussion to accompany principles or basic problems. Actually principles, 
rules, theories, etc. are interconnected or should be by a logic (or several 
logics?) in order to give validity and coherence to the different combinations 
of elements. For instance, in the biomolecular realm there is a fascinating 
interplay of activation and inhibition among the participating molecular 
partners (enzymes and proteins) as active elements.  I am not aware that 
classical ideas from Jacob (La Logique du vivant) have been sufficiently 
continued; it is not about Crick's Central Dogma but about the logic of 
pathways, circuits, modules, etc. Probably both Torday and Ji have their own 
ideas about that-- I would be curious to hear from them.

4. I loved Michel's response to Arturo's challenge. I think that the two 
"zeros" I mentioned days ago (the unsolved themes around the cycle and around 
the observer) imply both multidisciplinary thinking and philosophical 
speculation...

Best wishes--Pedro

-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 0
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

Re: [Fis] TR: some notes

2017-11-13 Thread Sungchul Ji
Pedro wrote:


"3. About logics in the pre-science, Joseph is quite right demanding that
discussion to accompany principles or basic problems. Actually
principles, rules, theories, etc. are interconnected or should be by a
logic (or several logics?) in order to give validity and coherence to
the different combinations of elements. For instance, in the
biomolecular realm there is a fascinating interplay of activation and
inhibition among the participating molecular partners (enzymes and
proteins) as active elements.  I am not aware that classical ideas from
Jacob (La Logique du vivant) have been sufficiently continued; it is not
about Crick's Central Dogma but about the logic of pathways, circuits,
modules, etc. Probably both Torday and Ji have their own ideas about
that-- I would be curious to hear from them."


(1)  Enzymes, like all molecular and sub-molecular species (generally called 
microscopic entities, quantum objects, quons [1], or wavicles), exhibit the 
wave-particle duality (as evidenced by the fact that they obey the Planckian 
Distribution Equation (PDE) [2-4]).  And yet most of the descriptions of enzyme 
mechanisms given in the current literature are based on the particle aspect of 
enzymes, including all the efforts directed at understanding enzyme activities 
in terms of the causal role of the static 1-dimensional sequences of amino 
acids or their 3-dimensional folds as revealed by X-ray crystallography.  
Alternatively, we can describe enzyme structure and function based on their 
wave attributes, in which case enzymes are viewed as systems of oscillators 
whose functions are determined by the collective vibrational motions of amino 
acid residues called "standing waves" (see Figure 8 in [3]).


(2)  Like electrons (see (4) below), enzymes (and biopolymers in general, 
including DNA; see Table 1 below) may possess two complementary attributes -- 
static and dynamic.  Just as the position and momentum of the electron cannot 
be accounted for by its static attributes alone, so perhaps the static 
attributes of enzymes (e.g., amino acid sequences) alone may not be sufficient 
to account for their dynamic attributes, i.e., their catalytic activities.  The 
missing link may be sought in their wave attributes, which have collective 
organizing power.  Traveling waves generated within a volume can interact to 
form "standing waves", also called "resonant waves", as exemplified by the 
Chladni plate shown below:


https://www.youtube.com/watch?v=wvJAgrUBF4w


(3)  There is accumulating evidence (references available upon request) to 
support the following mechanism of enzyme action:


    E  <--->  E'                                        (1)

    E' + S  <--->  [E.S <===> E.P]                      (2)

    [E.S <===> E.P]  <--->  E + P                       (3)

    E + S  <--->  E + P                                 (4)

Figure 1.  The pre-fit mechanism (in contrast to the better-known induced-fit 
mechanism of Koshland) of enzyme catalysis [1].  Symbols are defined as 
follows: E = ground-state enzyme; E' = conformationally excited enzyme (through 
thermal fluctuations); S = substrate; E.S = enzyme-substrate complex in the 
transition state; E.P = enzyme-product complex in the transition state; 
<---> = thermal equilibrium; <===> = the resonance hybrid between the 
enzyme-substrate and enzyme-product complexes.



Step (1) indicates that an enzyme molecule is a collection of oscillators that 
interact with one another to form higher-order structures, either local or 
global, known as resonances or standing waves.  In the Chladni plate, what 
causes the 'visible' standing waves of particles on it is the 'invisible' 
vibrational motion of the plate itself, and the particles are forced to form 
standing waves through resonance energy transfer from the plate to individual 
particles.  In enzymes, what causes the formation of the standing waves or 
resonant waves of the enzyme molecule are the component amino acid residues 
acting as elementary oscillators, whose periodic motions can combine, obeying 
the Fourier theorem, to form an almost infinite number of standing waves, some of

Re: [Fis] Idealism and Materialism

2017-11-06 Thread Sungchul Ji
Hi Xueshan and FISers,


I think it is a good idea to distinguish between two 'complementary' branches 
of informatics -- (i) pure (or Unified, Fundamental, etc.) and (ii) applied (or 
Branch, Special, Specialized, etc.).   This week I will post an example of the 
latter, using DNA as a model for defining what 'information' may be.  This post 
will be entitled:


Information in DNA: Letters, Words, Sentences, Texts, and Meanings


and will combine my earlier paper (The Linguistics of DNA: Words, Sentences, 
Grammar, Phonetics and Semantics, Ann. N. Y. Acad. Sci. 870:411-417 (1999)) 
with the results of a recent paper by Sergey Petoukhov (The rules of long 
DNA-sequences and tetra-groups of oligonucleotides, 
https://scirate.com/arxiv/1709.04943).  My current hypothesis is that what 
Petoukhov refers to as the tetra-groups may be what I call molecular words.


All the best.


Sung
















From: Fis  on behalf of Xueshan Yan 

Sent: Sunday, November 5, 2017 10:11 PM
To: FIS Group
Subject: Re: [Fis] Idealism and Materialism


Dear Krassimir and Colleagues,



Seventy years have passed since Wiener’s Cybernetics and Shannon’s Mathematical 
Theory of Communication, and 20 years since this FIS forum was founded. During 
this period, innumerable researchers have tried to find this "primary concept" 
of information, but without success: thing, knowledge, data, difference, 
perception, reflection, uncertainty, entropy, and so on. Therefore, I hope your 
"We need other primary concepts which will permit us to define information and 
to prove all consequences." won't arouse more enthusiasm for trying to find it; 
such efforts are an endless task. My opinion is that we had better turn our 
major attention to new searches, such as the general principles that Pedro 
started several weeks ago, and theorems, axioms, etc., or other aspects of 
Information Science. Definition is not the only way to build a science. I agree 
with Brenner's view that regards Information Science as a pre-science, and only 
in this way can we slowly advance it into a normal science, if possible.



Once we can put forward some basic knowledge in the form of principles, 
theorems, axioms, etc., we need to illustrate it at once. This immediately 
involves what you say: “Informatics lacks of well established primary concepts. 
The concept of information couldn't be primary because it couldn't be 
illustrated directly by real examples." In fact, standing on the position of 
Unified Information Science, all general principles, theorems, axioms, etc. are 
very difficult to illustrate, but they are rather easy to illustrate 
effectively in the Branch Informatics. In order to implement this strategy, the 
initial principles, theorems, axioms, etc. should also be based on Branch 
Informatics rather than Unified Information Science at first.



Best wishes,

Xueshan



From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On 
Behalf Of Krassimir Markov
Sent: Sunday, November 5, 2017 9:07 PM
To: Foundation of Information Science 
Subject: [Fis] Idealism and Materialism



Dear Bruno and FIS Colleagues,



Thank you very much for your useful remarks!



This week I was ill and couldn’t work.

Hope, the next week will be better for work.



Now I want only to paraphrase my post about Idealism and Materialism:



The first is founded on believing that the Intelligent Creation exists.



The second is founded on believing that the Intelligent Creation does not exist.



Both are kinds of religions because they could not prove their foundations by 
experiments and real examples.



The scientific approach does not believe in anything in advance. The primary 
concepts have to be illustrated by series of real examples. After that the 
secondary concepts have to be defined and all propositions have to be proved.



Are the mathematicians materialists or idealists?

Of course neither the first nor the second!



Mathematics is an example of the scientific approach.



Informatics lacks of well established primary concepts.

The concept of information couldn’t be primary because it couldn’t be 
illustrated directly by real examples.



We need other primary concepts which will permit us to define information and 
to prove all consequences.



Friendly greetings

Krassimir

















-Original Message-

From: Bruno Marchal

Sent: Sunday, November 05, 2017 12:30 PM

To: Foundation of Information Science

Subject: Re: [Fis] About 10 Principles







Dear Krassimir,





On 31 Oct 2017, at 15:07, Krassimir Markov wrote:



> Dear FIS Colleagues,

>

> Many years ago, in 2011, I had written a special remark about

> scientific and non-scientific approaches to try to understand the

> world around.

> The

> letter of Logan Streondj returns this theme as actual today.

>

> The interrelations between scientific and non-scientific creating and

> perceiving the data and models as well

Re: [Fis] mind-mind

2017-11-01 Thread Sungchul Ji
Hi Michel and FISers,


"Data is that what we see by using the eyes. Information is that what
we do not see by using the eyes, but we see by using the brain;
because it is the background to that what we see by using the eyes."


This paragraph contains the following pairs or relations:


Data ~ eyes

Information ~ brain


Since eyes cannot function without the brain but the brain can without eyes, I 
wonder if the above tetrad can be reduced to a triad:


Data ~ eyes/brain ~ information


which in turn may be explained in more detail using the ITR (Irreducible 
Triadic Relation) diagram thus:



                f                     g
   Reality ------------> Sign ------------> Interpretant
                        (Data)
      |                                          ^
      |__________________________________________|
                  h (information)


Figure 1.  The data-information relation explained on the basis of ITR 
(Irreducible Triadic Relation).  The arrows read "determines", and the 
interpretant is the effect the sign has on the mind of the interpreter (biotic 
or abiotic).  f = measurement (or eyes); g = mental process (or brain); 
h = correspondence or information flow.





If you have any question or comments, let me know.


Sung










From: Fis  on behalf of Michel Petitjean 

Sent: Wednesday, November 1, 2017 5:29 PM
To: fis
Subject: Re: [Fis] mind-mind

Dear Krassimir, dear ALex, dear All,

I agree with Krassimir that ideas cannot be transmitted directly from
Mind to Mind.
Being a materialist, I consider that only matter exists.
Does it mean that information is matter or energy?
No.
Let me discuss about this contradiction.
Parenthesis: energy is linked to mass through math modeling of
physical laws, and mass is a property of matter (could also be linked
to a modeling concept, but it is unimportant here).
People (not only scientists) build math and non math models to attempt
to explain what they observe.
Would you consider that math is matter?
Probably no.
Thus math and non math models that we build in our heads are not matter.
However they are produced through some biochemical process, and as
such they originate from matter.
Eventually, it could be considered that math and other concepts are a
somewhat special part of matter, but I think that claim would not be
accepted in our current language(s).
I consider that "soul", "god", and some other concepts are built in our heads.
In my opinion, these concepts at best incoherent, if not worse.
Remark: I have nothing against religions, as far as believers do not
impose to me the consequences of their beliefs.
Religious beliefs must be private affairs.
Here, please accept my apologies if some of you are shocked by the
previous sentences.

Information is like math: it is a modeling concept applied to some situations.
However, I do not claim that information can be reduced to the math
concepts of information.

To conclude:

1. I agree with Principle 1 of Pedro.

2. I assume potential contradictions in my views. No problem: I am a
poor philosopher.
Then, I never claimed that I am "built" to be able to elaborate a
coherent theory about life, consciousness, etc. Maybe it is
impossible; maybe that cannot be decided, etc.
All that is opinions. It is just nice and funny to discuss information
and so on.

3. If I would vote for a definition of information, I would retain the
one of Karl.
Citing Karl in his post of the 3 Oct 2017:
"Data is that what we see by using the eyes. Information is that what
we do not see by using the eyes, but we see by using the brain;
because it is the background to that what we see by using the eyes."

All my best,

Michel.

Michel Petitjean
MTi, INSERM UMR-S 973, University Paris 7,
35 rue Helene Brion, 75205 Paris Cedex 13, France.
Phone: +331 5727 8434; Fax: +331 5727 8372
E-mail: petitjean.chi...@gmail.com (preferred),
michel.petitj...@univ-paris-diderot.fr

[Fis] Two kinds of the Planckian Information -- the First (I_PF)and the Second Kind (I_PS)

2017-10-29 Thread Sungchul Ji
Hi FISers,


(I)  In the post dated March 21, 2017, I attached a file entitled "What is the 
Planckian information ?".   The Planckian information (symbolized by I_P) is 
defined as the binary logarithm of the ratio between the area under the curve 
(AUC) of PDE (Planckian Distribution Equation; see Eqn (1) in the file) and 
that of GLE (Gaussian-like Equation; see Eqn (2) in the file):


I_P = log_2 (AUC(PDE)/AUC(GLE))                                        (1)


PDE is the function for long-tailed histograms (both right and left long 
tailed) and GLE is the bell-shaped curve whose rising portion overlaps with the 
rising portion of the right-long tailed PDE as exemplified by Figures 1g, 1i, 
1k, 1o, 1r and 1t in the above file and in Figures 15, 16, 20, 22, and 23 in 
[1].  It is clear that the greater the deviation of PDE from GLE, the greater 
is the I_P value, since GLE represents randomness and the deviation of PDE from 
GLE represents non-randomness, order, or information.
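
As a concrete numerical illustration of Eqn (1), here is a minimal sketch in 
Python (my own toy, not the code used in the cited work); the PDE parameters, 
the GLE width, and the integration range are all arbitrary assumptions made 
only for illustration:

import math

# Sketch of Eqn (1): I_P = log_2(AUC(PDE)/AUC(GLE)).
def pde(x, A=1.0e9, B=0.7, C=39.0):
    # Planckian Distribution Equation: y = (A/(x+B)^5)/(e^(C/(x+B)) - 1)
    return (A / (x + B) ** 5) / (math.exp(C / (x + B)) - 1.0)

def gle(x, amp, mu, sigma):
    # Gaussian-like Equation: bell-shaped reference curve
    return amp * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def auc(f, lo, hi, n=20000):
    # trapezoidal area under the curve of f on [lo, hi]
    h = (hi - lo) / n
    return h * (sum(f(lo + i * h) for i in range(1, n)) + 0.5 * (f(lo) + f(hi)))

# Align the GLE peak with the rising portion of the PDE (an assumed fit).
xs = [i * 0.01 for i in range(1, 5001)]
x_peak = max(xs, key=pde)
I_P = math.log2(auc(pde, 0.01, 50.0) /
                auc(lambda x: gle(x, pde(x_peak), x_peak, 2.0), 0.01, 50.0))
print(f"I_P = {I_P:.3f} bits")   # ~1.2 bits under these assumed parameters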


(2) There may be many physical, chemical, or mental processes that can give 
rise to I_P by producing PDE from GLE.  One such mechanism is the so-called 
"drift-diffusion" mechanism well known in the field of decision-making 
psychophysics (see Figure 6 in [2]).
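
To see how such a mechanism yields a long right tail, here is a toy 
drift-diffusion simulation of my own (not the model of ref. [2]); the drift, 
noise, threshold, and step size are arbitrary assumptions:

import random

# Evidence x drifts toward a threshold with Gaussian noise; the
# first-passage times form a right-long-tailed histogram of the
# kind that PDE is meant to fit.
random.seed(0)

def first_passage(drift=0.1, noise=1.0, threshold=5.0, dt=0.05):
    x, t = 0.0, 0.0
    while x < threshold:
        x += drift * dt + noise * random.gauss(0.0, dt ** 0.5)
        t += dt
    return t

times = [first_passage() for _ in range(1000)]
counts = {}
for t in times:
    b = int(t // 20)                      # 20-time-unit bins
    counts[b] = counts.get(b, 0) + 1
for b in sorted(counts)[:8]:
    print(f"t in [{20 * b:4d}, {20 * (b + 1):4d}): {counts[b]} trials")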


(3)  Another mechanism for generating PDE from a Gaussian distribution is what 
I call the "Rutgers University Admissions Mechanism" (RUAM).  That is, if RUAM 
does not take into account the students' heights in the admissions process, 
the height distribution of the RU students would most likely be Gaussian.  
However, if RUAM favors short students over tall ones, the RU students' height 
distribution will be skewed from the normal curve, thus producing PDE.  The 
degree of skewness of PDE from its Gaussian counterpart (with an equal area 
under the curve) can be used as a measure of the information used by RUAM in 
selecting RU students.  The information derived from PDE based on its skewness 
will be referred to as the Planckian information of the second kind, I_PS, to 
distinguish it from the Planckian information defined previously (see Eqn (1)), 
which is now called the Planckian information of the first kind, I_PF:


I_PS = - log_2 ((mean - mode)/standard deviation)                      (2)
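
A minimal sketch of Eqn (2) on synthetic data (my own illustration; the 
right-skewed sample, the bin width used to estimate the mode, and hence the 
printed value are all assumptions):

import math
import random

# Estimate I_PS = -log_2((mean - mode)/sd) from a skewed sample.
random.seed(1)
sample = [random.lognormvariate(0.0, 0.75) for _ in range(10000)]

n = len(sample)
mean = sum(sample) / n
sd = (sum((x - mean) ** 2 for x in sample) / n) ** 0.5

# Crude mode estimate: midpoint of the most populated histogram bin.
width = 0.1
counts = {}
for x in sample:
    counts[int(x // width)] = counts.get(int(x // width), 0) + 1
mode = (max(counts, key=counts.get) + 0.5) * width

I_PS = -math.log2((mean - mode) / sd)
print(f"mean={mean:.3f}  mode={mode:.3f}  sd={sd:.3f}  I_PS={I_PS:.3f} bits")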


(4)  We have found that some experimental data (e.g., digitized water wave 
patterns produced by the sonified Raman spectral bands measured from single 
cells) that fit PDE are better modeled with I_PF, and some others (e.g., the 
mRNA levels measured from yeast cell ensembles) are better modeled with I_PS.


  (5)  If these considerations are substantiated further in the future, the 
following conclusions may be drawn:


   (a) There can be more than one kind of information that can be defined based 
on the same empirically derived mathematical equation, depending on supporting 
physical mechanisms (or formal algorithms ?).
   (b) The reasoning in (1) suggests that the mathematically defined 
"information" is arbitrary in the sense of Saussure.
   (c) The mathematically defined "information" can be viewed as a sign in the 
Peircean sense and hence is irreducibly triadic, as depicted in Figure 1:

                f                                      g
   Reality ------------> Quantitative Information ------------> Mechanism
      |                                                             ^
      |_____________________________________________________________|
                                   h

Figure 1.  The irreducibly triadic nature of the "quantitative information" or 
the "mathematical information".  f = measurement; g = mental process; 
h = correspondence, grounding.


(6)  Finally, it may be that PDE (or the skewed Gaussian distribution) provides 
a more general model for defining what "information" is than Shannon's 
communication system.


All the best.


Sung



References:

   [1] Ji, S. (2016).  Planckian Information (I_P): A New Measure of Order in 
Atoms, Enzymes, Cells, Brains, Human Societies, and the Cosmos.  In: Unified 
Field Mechanics: Natural Science beyond the Veil of Spacetime (Amoroso, R., 
Rowlands, P., and Kauffman, L., eds.), World Scientific, New Jersey, 2015, pp. 
579-589.  PDF at 
http://www.conformon.net/wp-content/uploads/2016/09/PDE_Vigier9.pdf
   [2] Ji, S. (2015). Planckian distributions in molecular machines, living 
cells, and brains: The wave-particle duality in biomedical 
sciences.

Re: [Fis] Data - Reflection - Information

2017-10-27 Thread Sungchul Ji
Hi FISers,


Reading the recent posts on "information" and related issues by Terry, Joseph, 
Pedro, Mark,  Krassimir, Loet, and others suggested to me the following 
possible definition of information (see Table 1) that may be consistent with 
those proposed by Terry, Shannon, Volkenstein,  Saussure, and Peirce (as I 
understand him), to varying degrees.

Table 1.  A unified definition of information based on the Mechanism of 
Irreducible Triadic Relation (MITR):

“Information is something that is transferred from A (e.g., the sender) to C 
(e.g., the receiver) mediated by B (e.g., sound signal) in such a manner that A 
and C become coupled, correlated, or coordinated.”

            f              g
      A ----------> B ----------> C
      |                           ^
      |___________________________|
                   h




       Terry          Shannon         Volkenstein   Peirce             Saussure

  A    Object         Sender          -             Object             Object
  B    Sign           Message         -             Sign               Sign
  C    Interpretant   Receiver        -             Interpretant       -
  f    Intrinsic      Coding          Amount        Natural process    Differentiation (?)
  g    Referential    Decoding        Meaning       Mental process     Arbitrariness
  h    Normative      Communication   Value         Correspondence/    -
                                                    Communication





I have the feeling that the number of columns in Table 1 can be increased to 
the right significantly as we extend the MITR-based definition of information 
to various fields of inquiry in the natural and human sciences.


Any suggestions, comments or corrections would be welcome.


Sung







From: Terrence W. DEACON 
Sent: Sunday, October 8, 2017 8:30 PM
To: Sungchul Ji
Cc: KrassimirMarkov; foundationofinformationscience; 钟义信
Subject: Re: [Fis] Data - Reflection - Information

Against "meaning"

I think that there is a danger of allowing our anthropocentrism to bias the 
discussion. I worry that the term 'meaning' carries too much of a linguistic 
bias.
By this I mean that it is too attractive to use language as our archetypal
model when we talk about information.
Language is rather the special case, the most unusual communicative adaptation 
to ever have evolved, and one that grows out of and depends on 
informational/semiotic capacities shared with other species and with biology in 
general.
So I am happy to see efforts to bring in topics like music or natural signs 
like thunderstorms and would also want to cast the net well beyond humans to 
include animal calls, scent trails, and molecular signaling by hormones. And it 
is why I am more attracted to Peirce and worried about the use of Saussurean 
concepts.
Words and sentences can indeed provide meanings (as in Frege's Sinn - "sense" - 
"intension") and may also provide reference (Frege's Bedeutung - "reference" - 
"extension"), but I think that it is important to recognize that not all signs 
fit this model. Moreover,

A sneeze is often interpreted as evidence about someone's state of health, and 
a clap of thunder may indicate an approaching storm.
These can also be interpreted differently by my dog, but it is still 
information about something, even though I would not say that they mean 
something to that interpreter. Both of these phenomena can be said to provide 
reference to something other than that sound itself, but when we use such 
phrases as "it means you have a cold" or "that means that a storm is 
approaching" we are using the term "means" somewhat metaphorically (most often 
in place of the more accurate term "indicates").

And it is even more of a stretch to use this term with respect to pictures or 
diagrams.
So no one would say that a specific feature like the ears in a caricatured face 
means something.
Though if the drawing is employed in a political cartoon e.g. with exaggerated 
ears and the whole cartoon is assigned a meaning then perhaps the exaggeration 
of this feature may become meaningful. And yet we would probably agree that 
every line of the drawing provides information contributing to that meaning.

So basically, I am advocating an effort to broaden our discussions and 
recognize that the term information applies in diverse ways to many different 
contexts. And because of this it is important to indicate the framing, whether 
physical, formal, biological, phenomenological, linguistic, etc.
For this reason, as I have suggested before, I would love to have a 
conversation in which we try to agree about which different uses of the 
information concept are appropriate for which contexts. The classic 
syntax-semantics-pragmatics distinction introduced by Charl

Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-14 Thread Sungchul Ji
Hi Arturo,


I agree.  Entropy can be negative MATHEMATICALLY, as Schroedinger assumed.

But what I am claiming is that this may be a mathematical artifact, since, 
according to the Third Law of Thermodynamics, there is no negative entropy.


All the best.


Sung



From: tozziart...@libero.it 
Sent: Friday, October 13, 2017 6:02 PM
To: Sungchul Ji
Cc: fis@listas.unizar.es
Subject: Re[2]: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION


Dear Sung,
I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds true.
Forget philosophical concepts like Yin and Yang, because, in some cases and 
contexts , entropy is negative.
Just to make an example,
"Since the entropy H(S|O) can now become negative, erasing a system can result 
in a net gain of work (and a corresponding cooling of the environment)."

https://www.nature.com/nature/journal/v474/n7349/full/nature10123.html

--
Sent from Libero Mail for Android

Friday, 13 October 2017, 10:11 PM +02:00 from Sungchul Ji 
s...@pharmacy.rutgers.edu:


Hi Arturo,


(1)  I don't understand where you got (or how you can justify) S = 1 J/K in 
your statement,


"With the same probability mass function, you can see that H = S/(ln(2)*kB), so 
setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits."


(2) I can see how one can get H = S/(ln(2)*k_B) mathematically, but what does 
this equality mean physically ?

(3) This reminds me of what Schroedinger did when he came up with the 
conclusion that "negative entropy" is equivalent to "order", which led to 
Brillouin's so-called "Negentropy Principle of Information (NPI)" [1, 2].


Simply by multiplying both sides of the Boltzmann equation by negative 
one, Schroedinger obtained the following formula:


 - S = - k ln W = k ln (1/W)


and then equating W with disorder, D, led him to


- S = k ln (1/D).


Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
he concluded that


"negative entropy = order".


As you can see, the above derivation is mathematically sound but the result 
violates the Third Law of Thermodynamics, according to which thermodynamic 
entropy cannot be less than zero.


Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" 
as follows [3]


"Schroedinger's paradox refers to the mathematical equations, concepts, or 
general statements that are formally true but physically meaningless."


(4) If my argument in (3) is valid, this may provide an example of what may be 
called


the "Unreasonable Ineffectiveness of Mathematics"


which, together with Wigner's "Unreasonable Effectiveness of Mathematics", may 
constitute an Yin-Yang pair of mathematics.


All the best.


Sung











References:
   [1]  Brillouin, L. (1953).  Negentropy Principle of Information, J. Applied 
Phys. 24(9), 1152-1163.

   [2]  Brillouin, L. (1956). Science and Information Theory, Academic Press, 
Inc., New York, pp. 152-156.

   [3] Ji, S. (2012).  The Third Law of Thermodynamics and "Schroedinger's 
Paradox".  In: Molecular Theory of the Living Cell: Concepts, Molecular 
Mechanisms, and Biomedical Applications.  Springer, New York.  pp. 12-15.  PDF 
at http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf









From: tozziart...@libero.it
Sent: Friday, October 13, 2017 4:43 AM
To: Sungchul Ji; fis@listas.unizar.es<mailto:fis@listas.unizar.es>
Subject: R: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

Dear Sung,
One J/K corresponds to 1.045×10^23 bits.

Indeed,
The Gibbs entropy formula states that thermodynamic entropy S equals 
kB*sum[pi*ln(1/pi)], with units of J/K, where kB is the Boltzmann constant and 
pi is the probability of microstate i. On the other hand, the Shannon entropy 
is defined as H = sum[pi*log2(1/pi)], with units of bits. With the same 
probability mass function, you can see that H = S/(ln(2)*kB), so setting S = 
1 J/K gives a Shannon entropy of 1.045×10^23 bits.
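
The arithmetic behind that conversion can be checked in a few lines of Python 
(a sketch; only the value of kB is a physical constant, the rest follows from 
the quoted formula):

import math

# H = S/(ln(2)*kB) for S = 1 J/K
kB = 1.380649e-23            # Boltzmann constant, J/K
S = 1.0                      # thermodynamic entropy, J/K
H = S / (math.log(2) * kB)
print(f"H = {H:.4e} bits")   # ~1.045e+23 bits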

On the other side, the energy

Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread Sungchul Ji
Hi Arturo,


(1)  I don't understand where you got (or how you can justify) S = 1 J/K in 
your statement,


"With the same probability mass function, you can see that H = S/(ln(2)*kB), so 
setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits."


(2) I can see how one can get H = S/(ln(2)*k_B) mathematically, but what does 
this equality mean physically ?

(3) This reminds me of what Schroedinger did when he came up with the 
conclusion that "negative entropy" is equivalent to "order", which led to 
Brillouin's so-called "Negentropy Principle of Information (NPI)" [1, 2].


Simply by multiplying both sides of the Boltzmann equation by negative 
one, Schroedinger obtained the following formula:


 - S = - k ln W = k ln (1/W)


and then equating W with disorder, D, led him to


- S = k ln (1/D).


Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
he concluded that


"negative entropy = order".


As you can see, the above derivation is mathematically sound but the result 
violates the Third Law of Thermodynamics, according to which thermodynamic 
entropy cannot be less than zero.


Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" 
as follows [3]


"Schroedinger's paradox refers to the mathematical equations, concepts, or 
general statements that are formally true but physically meaningless."


(4) If my argument in (3) is valid, this may provide an example of what may be 
called


the "Unreasonable Ineffectiveness of Mathematics"


which, together with Wigner's "Unreasonable Effectiveness of Mathematics", may 
constitute an Yin-Yang pair of mathematics.


All the best.


Sung











References:
   [1]  Brillouin, L. (1953).  Negentropy Principle of Information, J. Applied 
Phys. 24(9), 1152-1163.
   [2]  Brillouin, L. (1956). Science and Information Theory, Academic Press, 
Inc., New York, pp. 152-156.

   [3] Ji, S. (2012).  The Third Law of 
Thermodynamics<http://www.conformon.net/publications/book-chapters/#> and 
“Schroedinger’s Paradox”<http://www.conformon.net/?attachment_id=1033>.  
In: Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms, and 
Biomedical Applications.  Springer, New York.  pp. 12-15.  PDF at 
http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf









From: tozziart...@libero.it 
Sent: Friday, October 13, 2017 4:43 AM
To: Sungchul Ji; fis@listas.unizar.es
Subject: R: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

Dear Sung,
One J/K corresponds to 1.045×10^23 bits.

Indeed,
The Gibbs entropy formula states that thermodynamic entropy S equals 
kB*sum[pi*ln(1/pi)], with units of J/K, where kB is the Boltzmann constant and 
pi is the probability of microstate i. On the other hand, the Shannon entropy 
is defined as H = sum[pi*log2(1/pi)], with units of bits. With the same 
probability mass function, you can see that H = S/(ln(2)*kB), so setting S = 
1 J/K gives a Shannon entropy of 1.045×10^23 bits.

On the other side, the energy consumption per bit of data on the Internet is 
around 75 μJ at low access rates and decreases to around 2-4 μJ at an access 
rate of 100 Mb/s.
see:
http://www.ing.unitn.it/~fontana/GreenInternet/Recent%20Papers%20and%20p2p/Baliga_Ayre_Hinton_Sorin_Tucker_JLT0.pdf

Further, according to Landauer's theory, a minimum amount of heat (roughly 
10^-21 J per erased bit) must be dissipated when information is destroyed.
http://physicsworld.com/cws/article/news/2012/mar/12/wiping-data-will-cost-you-energy
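
For comparison, the Landauer bound itself is easy to evaluate (a sketch; 
T = 300 K is an assumed room temperature, and the bound is kB*T*ln(2) per 
erased bit):

import math

kB = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                    # assumed room temperature, K
E_bit = kB * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {E_bit:.2e} J per bit")   # ~2.9e-21 J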


In other words, summarizing, if you use the free energy to assess the 
information, it works the same, giving a quantifiable value.



Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/


Original message
From: "Sungchul Ji" 
Date: 12/10/2017 22.08
To: "Francesco Rizzo&q

Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-12 Thread Sungchul Ji
gnaling system, with all 
its variety of environmental signals and component pathways (which have been 
called 1–2-3 Component Systems), including the role of a few second messengers 
which have been pointed out in bacteria too. And in the other side, the gene 
transcription system as depending not only on signaling inputs but also on a 
diversity of factors. Amidst the continuum of energy, matter, and information 
flows, there seems to be evidence for signaling codes, mostly established 
around the arrangement of life-cycle stages, in large metabolic changes, or in 
the relationships with conspecifics (quorum sensing) and within microbial 
ecosystems. Additionally, and considering the complexity growth of signaling 
systems from prokaryotes to eukaryotes, four avenues or “roots” for the 
advancement of such complexity would come out. A comparison will be 
established between the signaling strategies and organization of both kinds 
of cellular systems. Finally, a new characterization of “informational 
architectures” will be proposed in order to explain the coding spectrum of both 
prokaryotic and eukaryotic signaling systems. Among other evolutionary aspects, 
cellular strategies for the construction of novel functional codes via the 
intermixing of informational architectures could be related to the persistence 
of retro-elements with obvious viral ancestry.
---


On 10/10/2017 at 11:14, tozziart...@libero.it wrote:
Dear FISers,
a proposal: information might stand for free energy.

Indeed, we know that, for an engine:
enthalpy = free energy + entropy x temperature.

At a fixed temperature,
enthalpy = free energy + entropy

The information detected (from an environmental object) by an observer is not 
the total possible one (the enthalpy encompassed in the object), but just a 
part, i.e., the part that is not uncertain for him (the free energy).  
Hence, every observer, depending on his peculiar features, detects a different 
amount of free energy and does not detect the uncertain part (the entropy).


Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/


Original message
From: "Christophe Menant" <christophe.men...@hotmail.fr>
Date: 10/10/2017 11.01
To: "dea...@berkeley.edu"
Cc: "fis@listas.unizar.es"
Subject: [Fis] TR: Data - Reflection - Information



Thanks for these comments Terry.

We should indeed be careful not to focus too much on language because 'meaning' 
is not limited to human communication. And also because starting at basic life 
level allows to address 'meaning' without the burden of complex performances 
like self-consciousness or free will. (The existing bias on language may come 
from analytic philosophy initially dealing with human performances).
Interestingly, a quite similar comment may apply to continental philosophy 
where the 'aboutness' of a mental state was invented for human consciousness. 
And this is of some importance for us because 'intentionality' is close to 
'meaning'. Happily enough 'bio-intentionality' is slowly becoming an acceptable 
entity 
(https://philpapers.org/rec/MENBAM-2).
Regarding Peirce, I'm a bit careful about using the triadic approach in FIS 
because non-human life was not a key subject for him, and also because the 
Interpreter which creates the meaning of the sign (the Interpretant) does not 
seem that much explicated or detailed.
The divisions you propose look interesting  (intrinsic, referential, 
normative). Would it be possible to read more on that (sorry if I have missed 
some of your posts)?
Best
Christophe


From: Fis  on behalf of Terrence W. DEACON <dea...@berkeley.edu>
Sent: Monday, 9 October 2017 02:30
To: Sungchul Ji
Cc: foundationofinformationscience
Subject: Re: [Fis] Data - Reflection - Information

Against "meaning"

I think that there is a danger of allowing ou

Re: [Fis] I agree with your considerations.

2017-10-09 Thread Sungchul Ji
Hi Krassimir,


You have my permission.

Good luck.


Sung


From: Fis  on behalf of Krassimir Markov 

Sent: Monday, October 9, 2017 5:32:59 AM
To: Foundation of Information Science
Subject: [Fis] I agree with your considerations.

Dear Yixin, Sung, Terry, Mark, and FIS Colleagues,

I agree with your considerations!

Let me remark that the General Information Theory is much more than a
single concept. You have seen that I have some answers in advance due to
the already developed theory.

What is important now is to finish this step and after that to continue
with the next. It may be just the idea about meaning.

What we have till now is the understanding that information is something 
more than data. In other words:

d = r
i = r + e

where:

d => data;
i => information;
r = > reflection;
e => something Else, internal for the subject (interpreter, etc.).

I need a week to finish our common with you current publication and to
send it to co-authors for final editing and after that for reviewing.

Dear Sung, Terry, and Mark, if you agree and give me permission, I
shall include your considerations at the end of the paper in the section
"Future work" and shall include you among the co-authors of the paper.

My next (second) post will be at the end of week.

Thank you very much for your great effort!

Friendly greetings
Krassimir



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Data - Reflection - Information

2017-10-08 Thread Sungchul Ji
Hi FISers,


Recent discussions on information on this list remind me of one of the main 
principles of signs advanced by Ferdinand de Saussure (1857-1913) -- the 
arbitrariness of linguistic signs.  In contrast, Peirce (1839-1914), a 
chemist-turned-logician-philosopher, seems to have succeeded in capturing the 
universal features of all signs, however fleeting, both linguistic and 
otherwise.


The power and utility of the Peircean definition of signs can be illustrated by 
applying his triadic definition of signs to the term 'information', viewed as 
a sign (having an arbitrary meaning, according to Saussure).  My impression is 
that all the varied definitions of information discussed on this list (which 
supports Saussure's principle of the arbitrariness of signs) can be 
organized using the ITR (Irreducible Triadic Relation) diagram embodying the 
Peircean principle of semiotics.  This is done in Figure 1 below, using the 
definition of 'information' that Professor Zhong recently provided as an 
example.  As you can see, the ITR template has 6 place-holders, 3 nodes and 3 
arrows, which can be populated by more than one set of concepts or terms, as 
long as the terms or concepts are consistent with one another and obey 
well-established laws of physics and logic.


                      f                    g
       Object ------------> Sign ------------> Interpretant
  (Object Information)     (Data)        (Perceived Information)
          |                                       ^
          |_______________________________________|
                            h

f = natural process (or information production)
g = mental process or computing (or information interpretation)
h = correspondence (or information flow)

Object = Something referred to by a sign
Sign = Something that stands to someone for something other than itself in
       some context.  Also called 'representamen'
Interpretant = The effect a sign has on the mind (or state) of the interpreter
       (human or non-human)

Figure 1.  A suggested definition of 'information' based on the triadic 
definition of the sign proposed by Peirce (1839-1914).  The symbol A ---> B 
reads as "A determines B", "A leads to B", "A is presupposed by B", "B 
supervenes on A" (http://www.iep.utm.edu/superven), etc.



With all the best.


Sung



From: Fis  on behalf of 钟义信 
Sent: Sunday, October 8, 2017 4:07:53 AM
To: KrassimirMarkov; foundationofinformationscience
Subject: Re: [Fis] Data - Reflection - Information

Dear Krassimir,

The formulas you proposed in your summary are good. May I mention that the 
following formulas would be more precise:

Object Info = External info = Syntactic info = Data
Perceived info = Internal info = Syntactic info + Semantic info + Pragmatic info

In other words, data is also a kind of information - called syntactic 
information, i.e., information without associated meaning or utility. And 
therefore we have a uniform concept of information.

So, the discussions we have last week is very much helpful!

Thank you!

--

Prof. Y. X. Zhong (钟义信)

Center for Intelligence Science Research

University of Posts & Telecommunications

Beijing 100876, China




- Reply message -
From: Krassimir Markov <mar...@foibg.com>
To: foundationofinformationscience 
Date: 2017-10-08 02:06:15
Subject: [Fis] Data - Reflection - Information


Dear FIS Colleagues,

It is time for my second post this week.

Many thanks to Christophe Menant (for the profound question) and to all
colleagues (for the very nice and useful comments)!

**

Christophe Menant had written:
“However, I'm not sure that “meaning” is enough to separate information
from data. A basic flow of bits can be considered as meaningless data.
But the same flow can give a meaningful sentence once correctly
demodulated.
I would say that:
1) The meaning of a signal does not exist per se. It is agent dependent.
- A signal can be meaningful information created by an agent (human
voice, ant pheromone).
- A signal can be meaningless (thunderstorm noise).
- A meaning can be generated by an agent receiving the signal
(interpretation/meaning generation).
2) A given signal can generate different meanings when received by
different agents (a thunderstorm noise generates different meanings for
someone walking on the beach or for a person in a house).
3) The domain of efficiency of the meaning should be taken into account
(human beings, ant-hill).
Regarding your positioning of data, I'm not sure to understand your
"reflecti

Re: [Fis] Heretic

2017-10-05 Thread Sungchul Ji
Hi FISers,


If the "information periodic table" approach  to  Information Science is right, 
which was described on this list a few days ago, the following  predictions may 
be made:


(1)  Just as there are a finite number of elements in the chemical periodic 
table that account for all the material objects in the Universe, so there may be 
a finite number (~ 10^2 ?) of token informations in the information periodic 
table that serve as the ontological basis for all the informations in the 
mental Universe.


(2) Again, just as quantum physicists recognize two kinds of attributes of 
quantum objects (also called quons or wavicles), i.e., 'static' attributes and 
'dynamic' attributes, the former being constant in time and 
observer-independent and the latter being time- and observer-dependent [1], 
so perhaps information scientists may find it necessary to recognize two 
aspects of information -- (i) 'static' information, and (ii) 'dynamic' 
information, the former being absolute and observer-independent (also called 
'objective information' ?), while the latter is relative and observer-dependent 
(also called 'subjective information' ?).


(3)  The famous 'complementarity' principle of Bohr, the Heisenberg principle, 
and the quantum wave functions do not apply to  the static attributes of quons 
but only to their dynamic attributes [1].


(4)  There are many dual aspects of information frequently discussed in the 
field of information science, e.g., "it from bit", "static vs. dynamic", 
"objective vs. subjective", "medium vs. message", and "signifier vs. signified" 
(see Table 1).  According to the triadic metaphysics of Peirce [2] (as I 
understand it), all these dualities are just the prescinded (i.e., detached 
for the convenience of thought) aspects of the ultimate reality, which is 
irreducibly triadic [3].


(5)  As you may recall, the periodic table of information was based on the 
three nodes, A, B and C, of the ITR (Irreducible Triadic Relation) network.  It 
is interesting to note that the three categories appearing in the first row of 
Table 1 below are related to these nodes and in fact can be viewed as their 
tokens:


                  f                          g
   Firstness ------------> Secondness ------------> Thirdness
       |                                                ^
       |________________________________________________|
                             h

Figure 1.  The isomorphism between the Peircean categories and the ITR 
(Irreducible Triadic Relation) network.

   f = manifestation/reification; g = habit formation; h = 
correspondence/information flow



(6)  In conclusion, all these discussions on the concept of information that 
we are having on this list and elsewhere may turn out to be mere tips of the 
enormous iceberg we call "information".



All the best.


Sung





Table 1.  The postulate that Peirce's metaphysics [2] is a theory of 
everything.  Red = Type; Green = Tokens.

                             Firstness                  Secondness          Thirdness

1. Quantum mechanics         Static information         Measurement/Data    Dynamic information
                                                                            (quantum mechanical
                                                                            information ?)

2. Wheeler's theory          Ultimate Reality (?)       It                  Bit

3. Cognitive science         Objective information (?)  Sign (?)            Subjective information (?)

4. McLuhan                   Ontology                   Medium/Sign         Message

5. Saussure's semiology      Signified                  Signifier           ?

6. Peirce's semiotics        Object                     Sign                Interpretant

7. Periodic table theory     Time-invariant             Data/Sign (?)       Time-dependent
   of information            information                                    information





References:

[1] Herbert, N. (1987). Quantum Reality: Beyond the New Physics and 
Excursion into  Metaphysics . . . . Anchor Books, New York.  pp. 46, 99-100, 
102, 168, 193.
[2]  Categories (Peirce).  https://en.wikipedia.org/wiki/Categories_(Peirce)
[3] Ji, S. (2017).  The Cell Language  Theory: Connecting Mind and Matter.  
World Scientific, New Jersey.  Section 10.20.


From: Fis  on behalf of Bob Logan 

Sent: Thursday, October 5, 2017 12:39 PM
To: Arturo Tozzi
Cc: fis
Subject: Re: [Fis] Heretic

Dear Arturo - I enjoyed your expression of your opinion  because of its 
directness and honesty even though I do not quite agree with everything you 
said. I enjoyed it because it provoked the following thoughts.

Yes, you are right: there seems to be a variety of opinions as to just what 
information is. All of them are correct and all of them are wrong, including 
mine, which I will share with you in a moment. They are right in that they 
describe some aspect of the notion of info

[Fis] Information Periodic Table (IPT) or the Periodic Table of Information Science (PTIS)

2017-10-03 Thread Sungchul Ji
fferent disciplines is factually impossible (or utterly irrelevant):
think on the connection between Euclidean geometry and politics, biology,
etc. I think Ortega offers the right interpretation of that. When
Aristotle makes the first classification of the sciences, he is continuing
with that very idea. Theoretical sciences, experimental or productive
sciences, and applied or practical sciences--with an emphasis on the
explanatory theoretical power of both physics and mathematics (ehm, Arturo
will agree fully with him). I have revisited my old reading notes and I
think that the Aristotelian confrontation with the Platonic approach to
the unity of knowledge that Ortega comments is extremely interesting for
our current debate on information principles.

There is another important aspect related to the first three principles in
my original message (see at the bottom). It would be rather strategic to
achieve a consensus on the futility of struggling for a universal
information definition. Then, the tautology of the first principle ("info
is info") is a way to sidestep that definitional aspect. Nevertheless, it
is clear that interesting notions of information may be provided relative
to some particular domains or endeavors. For instance, "propagating
influence" by our colleague Bob Logan, Stuart Kauffman and others, and
many other notions or partial definitions as well--I include my own
"distinction on the adjacent" as valuable for the informational approach
in biology. Is this "indefinability" an undesirable aspect? To put an
example from physics, time appears as the most undefinable of the terms,
but it shows up in almost all equations and theories of physics...
Principle three means that one can do a lot of things with info without
the need of defining it.

As for the subject that is usually coupled to the info term, as our
discussion advances further, entering the "information flows" will tend to
clarify things. The open-ended relationship with the environment that the
"informational entities" maintain via the channeling of those info
flows--it is a very special coupling indeed--allows these entities the
further channeling of the "energy flows" for self-maintenance. Think on
the living cells and their signaling systems, or think on our "info"
societies. Harold Morowitz's "energy flow in biology" has not been
paralleled yet by a similar "information flow in biology". One is
optimistic that the recent incorporation of John Torday, plus Shungchul Ji
and others, may lead to a thought-collective capable of illuminating the
panorama of biological information.

(shouldn't we make an effort to incorporate other relevant parties, also
interested in biological information, to this discussion?)

Best wishes--Pedro

On 23/09/2017 at 21:27, Sungchul Ji wrote:

Hi Fisers,




I agree.

Communication may be the key concept in developing a theory of information.




Just as it is impossible to define what energy is without defining the
thermodynamic system under consideration (e.g., energy is conserved only
in an isolated system and not in closed or open systems; the Gibbs free
energy content decreases only when a spontaneous process  occurs in
non-isolated systems with a constant temperature and pressure, etc.), so it
may be that 'information' cannot be defined rigorously without  first
defining the "communication system" under consideration.   If this analogy
is true, we can anticipate that, just as there are many different kinds of
energies depending on the characteristics of the thermodynamic systems
involved, so there may be many different kinds of 'informations' depending
on the nature of the communication systems under consideration.




The properties or behaviors of all thermodynamic systems depend on their
environment, and there are three  system-environment relations -- (i)
isolated (e.g., the Universe, or the thermos bottle), (ii) closed (e.g.,
refrigerator), and (iii) open (e.g., the biosphere, living cells).




It is interesting to note that all communication systems (e.g., cells,
organs, animals, humans) may embody ITR (Irreducible Triadic Relation),
which I found convenient to represent diagrammatically using a 3-node
network of arrows as shown below:




            f              g
      A ----------> B ----------> C
      |                           ^
      |___________________________|
                   h

Figure 1.  The Irreducible Triadic Relation (ITR) of C. S. Peirce
(1839-1914) represented as a 3-node, closed and directed network.  The
arrows form the commutative triangle of category theory, i.e., operation
f followed by operation g leads to the same result as operation h

Re: [Fis] Principles of IS

2017-09-23 Thread Sungchul Ji
Hi Fisers,


I agree.

Communication may be the key concept in developing a theory of information.


Just as it is impossible to define what energy is without defining the 
thermodynamic system under consideration (e.g., energy is conserved only in an 
isolated system and not in closed or open systems; the Gibbs free energy 
content decreases only when a spontaneous process occurs in non-isolated 
systems with a constant temperature and pressure, etc.), so it may be that 
'information' cannot be defined rigorously without  first defining the 
"communication system" under consideration.   If this analogy is true, we can 
anticipate that, just as there are many different kinds of energies depending 
on the characteristics of the thermodynamic systems involved, so there may be 
many different kinds of 'informations' depending on the nature of the 
communication systems under consideration.


The properties or behaviors of all thermodynamic systems depend on their 
environment, and there are three  system-environment relations -- (i) isolated 
(e.g., the Universe, or the thermos bottle), (ii) closed (e.g., refrigerator), 
and (iii) open (e.g., the biosphere, living cells).


It is interesting to note that all communication systems (e.g., cells, organs, 
animals, humans) may embody ITR (Irreducible Triadic Relation), which I found 
convenient to represent diagrammatically using a 3-node network of arrows as 
shown below:


            f              g
      A ----------> B ----------> C
      |                           ^
      |___________________________|
                   h

Figure 1.  The Irreducible Triadic Relation (ITR) of C. S. Peirce (1839-1914) 
represented as a 3-node, closed and directed network.  The arrows form the 
commutative triangle of category theory, i.e., operation f followed by 
operation g leads to the same result as operation h, here denoted as f x g = h.

f = information production; g = information interpretation; h = correspondence 
or information flow.  Please note that processes f and g are driven by 
exergonic physicochemical processes, and h requires a pre-existing code or 
language that acts as the rule mapping A to C.
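
To make the commutativity condition f x g = h concrete, here is a small toy 
sketch of my own (an illustration, not anyone's published formalism): it checks 
that g(f(a)) == h(a) over a set of objects, with DNA strings as a purely 
hypothetical token of node A:

def commutes(f, g, h, objects):
    # ITR commutative triangle: f followed by g must equal h everywhere
    return all(g(f(a)) == h(a) for a in objects)

# Toy token: A = DNA strings, f = information production (transcription),
# g = information interpretation (a placeholder), h = correspondence.
f = lambda dna: dna.replace("T", "U")     # DNA -> mRNA
g = lambda rna: "effect(" + rna + ")"     # mRNA -> effect (placeholder)
h = lambda dna: g(f(dna))                 # h defined so the triangle commutes

print(commutes(f, g, h, ["ATGC", "TTAA", "GATTACA"]))   # True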


Again, just as generations of thermodynamicists in the 19th-20th centuries 
defined various kinds of "energies" (enthalpy, Helmholtz free energy, Gibbs 
free energy) applicable to different kinds of thermodynamic systems, so 
'information scientists' of the 21st century may have the golden opportunity 
to define as many kinds of 'informations' as needed for the different kinds of 
"communication systems" of their interest, some examples of which are 
presented in Table 1.





Table 1.  A 'parametric' definition of information based on the values of the 
three nodes of the ITR, Figure 1.

Communication system              A           B               C
(Information)
_________________________________________________________________________

Cells                             DNA/RNA     Proteins        Chemical reactions
(Biological informations)                                     or chemical waves
_________________________________________________________________________

Humans                            Sender      Message         Receiver
(Linguistic informations)
_________________________________________________________________________

Signs                             Object      Representamen   Interpretant
(Semiotic informations, or
'Universal informations' (?))
_________________________________________________________________________


With all the best.


Sung



From: Fis  on behalf of JOHN TORDAY 

Sent: Saturday, September 23, 2017 10:44:33 AM
To: fis@listas.unizar.es
Subject: [Fis] Principles of IS

Dear Fis, I am a newcomer to this discussion, but suffice it to say that I have 
spent the last 20 years trying to understand how and why physiology has 
evolved. I stumbled upon your website because Pedro Marijuan had reviewed a 
paper of ours on 'ambiguity' that was recently published in Progr Biophys Mol 
Biol, July 22, 2017, FYI.
Cell-cell communication is the basis for molecular embryology/morphogenesis. 
This may seem tangential at best to your discussion of Information Science, but 
if you'll bear with me I will get to the point. In my (humble) opinion, 
information is the 'language' of evolution, but communication of infor

Re: [Fis] INFORMATION: JUST A MATTER OF MATH

2017-09-18 Thread Sungchul Ji
Hi Fisers,


(1) I have been working with the so-called Planckian Distribution Equation 
(PDE) that fits almost all long-tailed histograms generated in many fields of 
natural and human sciences, including atomic physics, protein folding, cell 
metabolism, decision making, glottometrics, econometrics, etc. [1-3].


(2) Defining what 'information' is may be akin to defining what 'PDE' is.  In 
both cases, what we want to define has what may be called the type-token 
duality (TTD), as indicated in Table 1.

The type of 'information' may not be easily defined in words alone but may be 
defined unambiguously only in terms of a combination of words and a diagram, as 
shown in the first column in Table 1.  Similarly, defining what PDE is may be 
almost impossible using words alone but must utilize mathematical symbols, not 
natural linguistic symbols (see the top of the second column).


(3) Just as there is an almost infinite number of the 'tokens' of the 'PDE 
type', each 'token PDE' being characterized by a unique set of three numerical 
values of A, B and C (see the second column of Table 1), so there may be an 
almost infinite number of the 'tokens' of the 'information type', each 'token 
information' being characterized by a unique set of 6 parameters, A, B, C, f, 
g, and h, that constitute the ITR, Irreducible Triadic Relation [4].  Such a 
definition of 'information' may be consistent with the 'parametric definition 
of information' proposed by Mark in 2010 [5].


(4) As I intend to discuss in more detail in a later post, it is interesting 
to point out that the shape of the PDE curve embodies both the Shannon 
entropy (the width of the curve as a measure of disorder) and the Planckian 
information (the binary logarithm of the non-symmetric portion of the curve, 
denoted as I_P, as a measure of order) [1-4].  Furthermore, I_P quantifies the 
informational aspect of what I call "gnergy" [6], the complementary union of 
information (gn-) and energy (-ergy), that has been postulated to drive all 
self-organizations in the Universe, from chemical reactions to the 
organizations of the Cosmos, including the mental activities of Homo sapiens 
[4, 7].




Table 1.  The type-token duality (TTD) of concepts, including information and 
parametric equations.

Type (Information):  the Irreducible Triadic Relation (ITR),

            f              g
      A ----------> B ----------> C
      |                           ^
      |___________________________|
                   h

   => f x g = h
   "Operation f followed by operation g leads to the same result as operation h"

Type (PDE):

   y = (A/(x + B)^5)/(e^(C/(x + B)) - 1)

   y = frequency or probability
   x = classes or bins
   A, B and C = free parameters

Tokens (Information):

   Semiotics:  A = object; B = sign; C = interpretant;
      f = sign production; g = sign interpretation;
      h = correspondence or information flow

   Aristotle:  A = Hylomorph; B = Matter; C = Form;
      f = natural process; g = mental process; h = information flow

   Spinoza:  A = Substance; B = Extension; C = Thought;
      f = natural process; g = mental process; h = information flow

   General relativity:  A = Matter; B = curved spacetime; C = motion;
      f = mass-induced curvature; g = geodesic motion; h = gravity

Tokens (PDE):

   Protein folding:  y = frequency; x = folding free energy;
      A = 1.24x10^9; B = 0.721; C = 38.97

   mRNA levels in yeast under stress:  y = frequency or probability;
      x = mRNA copy number class or bin;
      A = 1.11x10^12; B = 13.96; C = 159.30

   mRNA levels in human breast tissues:  y = frequency or probability;
      x = mRNA copy number class or bin;
      A = 2.36x10^3; B = 0.213; C = 5.00

   Protein chain length distribution:  y = frequency or probability;
      x = protein chain length;
      A = 2.04x10^13; B = 5.655; C = 1.257x10^3
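
To illustrate the duality numerically, the following sketch (my own; the peak 
positions it prints are computed for illustration, not quoted from the papers) 
evaluates the single PDE 'type' at three of the parameter 'tokens' above:

import math

def pde(x, A, B, C):
    # y = (A/(x + B)^5)/(e^(C/(x + B)) - 1)
    return (A / (x + B) ** 5) / (math.exp(C / (x + B)) - 1.0)

tokens = {
    "protein folding":          (1.24e9,  0.721, 38.97),
    "yeast mRNA under stress":  (1.11e12, 13.96, 159.30),
    "human breast tissue mRNA": (2.36e3,  0.213, 5.00),
}

xs = [i * 0.01 for i in range(1, 10001)]        # scan x over (0, 100]
for name, (A, B, C) in tokens.items():
    x_peak = max(xs, key=lambda x: pde(x, A, B, C))
    print(f"{name}: PDE peaks near x = {x_peak:.2f}")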




With all the best.


Sung




References:


   [1] Ji, S. (2016).  Wave-Particle Duality in Physics and Biomedical 
Sciences.  Symmetry: Science and Culture 27(2): 99-127.  PDF at 
http://www.conformon.net/wp-content/uploads/2016/09/PDE_SymmetryFestival_2016.pdf

   [2] Ji, S. (2015).  Planckian distributions in molecular machines, living 
cells, and brains: The wave-particle duality in biomedical sciences.  In: 
Proceedings of the International Conference on Biology and Biomedical 
Engineering, Vienna, March 15-17, 2015, pp. 115-137.  PDF at 
http://www.conformon.net/wp-content/uploads/2016/09/PDE_Vienna_2015.pdf

Re: [Fis] planckian information

2017-04-15 Thread Sungchul Ji
Hi Pedro,


(1)  You wrote  "What I do not see is the relationship to be established with 
languages: at cosmic, cellular, and human scales. Is it necessary? Is it 
convenient? "


The connection between the Planckian information and languages (cosmic, human 
and cellular) may be described in terms of the following three general 
statements (to be referred to as laws):


The First Law of Informatics [1]: Information can, but entropy cannot, be 
negative; or, the Planckian information is not the negative Shannon entropy.

The Second Law of Informatics: No information can be transferred from the sender 
to the receiver without a language or a code understood by both.


The Third Law of Informatics: No information can be transferred from the sender 
to the receiver without dissipating at least kT joules of energy per bit.



These laws can be viewed as reflecting different aspects of the phenomenon of 
communication schematically shown in Figure 1.  Step h implicates 
'information', while Steps f and g, being driven by free energy dissipation, 
implicate entropy via the Gibbs free energy equation, G = E + PV - TS, where E 
= energy, P = pressure, V = volume, T = temperature and S = entropy.


                       f                    g
   Sender/Source ------------> Message ------------> Receiver
         |                                               ^
         |_______________________________________________|
                              h

Figure 1.  Communication as an irreducible triadic relation (ITR).  f = 
selection; g = recognition; h = information flow enabled by the common language 
or code.


(2)  The Planckian information (I_P) is an aspect of the Planckian process, 
defined as "any physicochemical or formal process that produces (by selecting 
subsets from a random set [2]) long-tailed histograms fitting the Planckian 
Distribution Equation (PDE)".  As you may recall, I_P is mathematically defined 
in terms of PDE and GLE (Gaussian-like Equation).  Thus defined, the Planckian 
process embodies the ITR as follows:


                    f                        g
   Random set ------------> histograms ------------> PDE
        |                                              ^
        |______________________________________________|
                             h

Figure 2.  A diagrammatic representation of the Planckian process as an ITR.  
f = some selection mechanism; g = only some long-tailed histograms fit PDE; h = 
PDE originates in (or reflects the intrinsic properties of) the original 
random set.


(3) If Figures 1 and 2 are right, languages are components of ITR as is I_P.  
Thus, I_P is related to languages via the principle of ITR.



(4) I believe that both the human language and the cell language embody ITR and 
hence are isomorphic.  This conclusion is supported by (i) a detailed 
comparison of these two kinds of languages, as summarized in the two tables 
attached, and (ii) the recent findings that PDE fits the long-tailed histograms 
generated by human speech and cell metabolism (see the related figures attached 
to my first FIS post entitled What Is the Planckian Information ?).


(5)  I further believe that there are many abiotic processes that are 
Planckian.  That is, there are many non-living processes that produce 
histograms fitting PDE and hence Planckian, e.g., the blackbody radiation 
process itself (see the related figure in my first FIS post mentioned above) 
and the prototypical self-organizing chemical reaction-diffusion system known 
as the Belousov-Zhabotinsky reaction [3].  Therefore there must exist a 
cosmic language (or cosmese) [4].


(6)  The three languages discussed here, cellese, humanese, and cosmese appear 
to share a set of common principles --- (i) ITR, (ii) the Planckian process, 
and (iii) the wave-particle duality discussed elsewhere [5].



If you have any questions or corrections, please let me know.


All the best.


Sung




References:

   [1] Ji, S. (2017).  The Cell Language Theory: Connecting Matter and Mind.  
World Scientific, New Jersey. In press.  Section 8.5.2.
   [2] Ji, S. (2017).  ibid.  Section 8.4.1.

   [3] Ji, S. (2017).  ibid.  Section 9.4.3.
   [4] Ji, S. (2012).  Molecular Theory of the Living Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications.  Springer, New York.  Table 
2.13, pp. 44-45.
   [5] Ji, S. (2016). Waves as the Symmetry Principle Underlying Cosmi

Re: [Fis] Causation is transfer of information

2017-03-29 Thread Sungchul Ji
Hi Soeren and FISers,


(1) Tychism is intrinsic to the Planckian information, since it is defined as 
the binary logarithm of the ratio of the area under the curve (AUC) of the 
Planckian distribution (PDE)  over the AUC of the Gaussian-like Equation (GLE):


  I_P = log_2 (AUC(PDE)/AUC(GLE))


Tychism is implied in GLE.


(2)  The Planckian processes are defined as those physicochemical or formal 
processes that generate long-tailed histograms (or their superpositions) 
fitting PDE (or its superpositions).  The Planckian process seems irreducibly 
triadic in the Peircean sense:



                       f                                     g
   Random processes ------------> Long-tailed histograms ------------> PDE
     (Firstness)                      (Secondness)                (Thirdness)
        |                                                              ^
        |______________________________________________________________|
                                      h

Figure 2.  The Irreducible Triadic Relation (ITR) embodied in the Planckian 
processes.  f = selection process, either natural or artificial; g = 
mathematical modeling; h = grounding, correspondence, or information flow.


(3)  (to be continued)


All the best.


Sung






From: Søren Brier 
Sent: Wednesday, March 29, 2017 7:06 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information


Dear Sung



It is difficult for me to say as you do not make your metaphysical framework 
explicit.  This was the great work Peirce did. I am pretty sure you do not have 
a dynamic triadic process concept of semiosis based on a tychastic theory of 
Firstness as potential qualia or forms of feeling of which information is only 
an aspect.



Best

   Søren



From: Sungchul Ji [mailto:s...@pharmacy.rutgers.edu]
Sent: 29. marts 2017 20:35
To: Søren Brier; Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Soeren,



Can you be more specific about which aspects of my proposal, described in my 
previous emails, you think are my own and have nothing to do with (or are even 
based on my misinterpretation of) Peirce?



Thanks in advance.



Sung











From: Søren Brier <sbr@cbs.dk>
Sent: Wednesday, March 29, 2017 2:10 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information



Dear Sung



I suggest you call this your own theory and make your own definitions of terms, 
because you confuse things by attempting to draw on Peirce, because there is a 
whole process philosophy with synechism, tychism, agapism and Scholastic 
realism plus a phenomenological and mathematically based  triadic metaphysics 
as the basis of Peirce’s concepts, which is the fruit of his life’s work. I do 
not think you are ready to carry that load. It takes many years to understand 
fully. The ‘sign’ is a triadic process of representamen, object and 
interpretant working in the realm of Firstness, Secondness and Thirdness in a 
society at large or a society of researchers devoted to the search for truth 
producing the meaning of signs, which when developed into propositional 
arguments can be tested in the fallible scientific process  of generating more 
rationality in culture as well as nature.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Sungchul Ji
Sent: 29. marts 2017 00:27
To: Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Fisers,



I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps 'amount' = f, 'meaning' = g, and 
'value' = h?




Re: [Fis] Causation is transfer of information

2017-03-29 Thread Sungchul Ji
Hi Soeren,


Can you be more specific about which aspects of my proposal, described in my 
previous emails, you think are my own and have nothing to do with (or are even 
based on my misinterpretation of) Peirce?


Thanks in advance.


Sung






From: Søren Brier 
Sent: Wednesday, March 29, 2017 2:10 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information


Dear Sung



I suggest you call this your own theory and make your own definitions of terms, 
because you confuse things by attempting to draw on Peirce, because there is a 
whole process philosophy with synechism, tychism, agapism and Scholastic 
realism plus a phenomenological and mathematically based  triadic metaphysics 
as the basis of Peirce’s concepts, which is the fruit of his life’s work. I do 
not think you are ready to carry that load. It takes many years to understand 
fully. The ‘sign’ is a triadic process of representamen, object and 
interpretant working in the realm of Firstness, Secondness and Thirdness in a 
society at large or a society of researchers devoted to the search for truth 
producing the meaning of signs, which when developed into propositional 
arguments can be tested in the fallible scientific process  of generating more 
rationality in culture as well as nature.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Sungchul Ji
Sent: 29. marts 2017 00:27
To: Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Fisers,



I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps 'amount' = f, 'meaning' = g, and 
'value' = h?



                f                  g
     Object -----------> Sign -----------> Interpretant
       |                                        ^
       |                                        |
       |________________________________________|
                          h



Figure 1.  The Irreducible Triadic Relation (ITR) of semiosis (also called sign 
process or communication) first clearly articulated by Peirce to the best of my 
knowledge. Warning: Peirce often replaces Sign with Representamen and 
represents the whole triad, i.e., Figure 1 itself (although he did not use such 
a figure in his writings) as the Sign. Not distinguishing between these two 
very different uses of the same word "Sign" can lead to semiotic confusions.   
The three processes are defined as follows: f = sign production, g = sign 
interpretation, h = information flow (other ways of labeling the arrows are not 
excluded).   Each process or arrow reads "determines", "leads to", "is presupposed 
by", etc., and the three arrows constitute a commutative triangle of category 
theory, i.e., f x g = h, meaning f followed by g leads to the same result as h.
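
Read as ordinary functions, the commutativity claim f x g = h is just function 
composition; a minimal sketch (the string outputs below are hypothetical 
placeholders, not a semiotic model):

  def f(obj):
      # sign production: Object -> Sign
      return "sign(" + obj + ")"

  def g(sign):
      # sign interpretation: Sign -> Interpretant
      return "interpretant(" + sign + ")"

  def h(obj):
      # information flow: Object -> Interpretant; defined so the triangle commutes
      return g(f(obj))

  # f followed by g gives the same result as h
  assert h("apple") == g(f("apple"))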



I started using  the so-called  ITR template, Figure 1,  about 5 years ago, and 
the main reason I am bringing it up here is to ask your critical opinion on my 
suggestion published in 2012 (Molecular Theory of the Living  Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications, Springer New York, p ~100 ?) 
that there are two kinds of causality -- (i) the energy-dependent causality 
(identified with Processes f and g in Figure 1) and (ii) the information (and 
hence code)-dependent causality (identified with Process h).  For convenience, 
I coined the term 'codality' to refer to the latter to contrast it with the 
traditional term causality.



I wonder if we can  view John's idea of the relation between 'information' and 
'cause' as being  an alternative way of expressing the same ideas as the 
"energy-dependent causality" or the "codality" defined in Figure 1.



All the best.



Sung







From: Fis <fis-boun...@listas.unizar.es> on behalf of Terrence W. DEACON 
<dea...@berkeley.edu>
Sent: Tuesday, March 28, 2017 4:23:14 PM
To: John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Corrected typos (in case the intrinsic redundancy didn't compensate for these 
minor corruptions of the text):



 information-beqaring medium =  information-bearing medi

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Sungchul Ji
Hi Fisers,


I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps 'amount' = f, 'meaning' = g, and 
'value' = h?


                f                  g
     Object -----------> Sign -----------> Interpretant
       |                                        ^
       |                                        |
       |________________________________________|
                          h


Figure 1.  The Irreducible Triadic Relation (ITR) of semiosis (also called sign 
process or communication) first clearly articulated by Peirce to the best of my 
knowledge. Warning: Peirce often replaces Sign with Representamen and 
represents the whole triad, i.e., Figure 1 itself (although he did not use such 
a figure in his writings) as the Sign. Not distinguishing between these two 
very different uses of the same word "Sign" can lead to semiotic confusions.   
The three processes are defined as follows: f = sign production, g = sign 
interpretation, h = information flow (other ways of labeling the arrows are not 
excluded).   Each process or arrow reads "determines", "leads to", "is presupposed 
by", etc., and the three arrows constitute a commutative triangle of category 
theory, i.e., f x g = h, meaning f followed by g leads to the same result as h.


I started using  the so-called  ITR template, Figure 1,  about 5 years ago, and 
the main reason I am bringing it up here is to ask your critical opinion on my 
suggestion published in 2012 (Molecular Theory of the Living  Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications, Springer New York, p ~100 ?) 
that there are two kinds of causality -- (i) the energy-dependent causality 
(identified with Processes f and g in Figure 1) and (ii) the information (and 
hence code)-dependent causality (identified with Process h).  For convenience, 
I coined the term 'codality' to refer to the latter to contrast it with the 
traditional term causality.


I wonder if we can  view John's idea of the relation between 'information' and 
'cause' as being  an alternative way of expressing the same ideas as the 
"energy-dependent causality" or the "codality" defined in Figure 1.


All the best.


Sung




From: Fis  on behalf of Terrence W. DEACON 

Sent: Tuesday, March 28, 2017 4:23:14 PM
To: John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information

Corrected typos (in case the intrinsic redundancy didn't compensate for these 
minor corruptions of the text):

 information-beqaring medium =  information-bearing medium

appliction = application

 conceptiont =  conception

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
<dea...@berkeley.edu> wrote:
Dear FIS colleagues,

I agree with John Collier that we should not assume to restrict the concept of 
information to only one subset of its potential applications. But to work with 
this breadth of usage we need to recognize that 'information' can refer to 
intrinsic statistical properties of a physical medium, extrinsic referential 
properties of that medium (i.e. content), and the significance or use value of 
that content, depending on the context.  A problem arises when we demand that 
only one of these uses should be given legitimacy. As I have repeatedly 
suggested on this listserve, it will be a source of constant useless argument 
to make the assertion that someone is wrong in their understanding of 
information if they use it in one of these non-formal ways. But to fail to mark 
which conception of information is being considered, or worse, to use equivocal 
conceptions of the term in the same argument, will ultimately undermine our 
efforts to understand one another and develop a complete general theory of 
information.

This nominalization of 'inform' has been in use for hundreds of years in legal 
and literary contexts, in all of these variant forms. But there has been a 
slowly increasing tendency to use it to refer to the information-beqaring 
medium itself, in substantial terms. This reached its greatest extreme with the 
restricted technical usage formalized by Claude Shannon. Remember, however, 
that this was only introduced a little over a half century ago. When one of his 
mentors (Hartley) initially introduced a logarithmic measure of signal capacity 
he called it 'intelligence' — as in the gathering of intelligence by a spy 
organization. So had Shannon chose 

Re: [Fis] Information: a metaphysical word

2017-03-27 Thread Sungchul Ji
Hi FISers,


I wonder if some of the controversies surrounding information can be traced to 
conflating TYPES and TOKENS.  For example, apples and oranges (tokens) are not 
the same, but they are both species of the type fruit.  Likewise, the 
information exchanged among non-living systems like computers is not the same 
as the information exchanged among humans or between humans and machines, but 
both are species of the type called information, which to me is characterized by 
the Irreducible Triadic Relation (ITR) of C. S. Peirce.


If this line of reasoning is valid, it may be justified to coin a new term, 
"informons", to refer to specific tokens carrying information.


All the best.


Sung


From: Fis  on behalf of Robert E. Ulanowicz 

Sent: Monday, March 27, 2017 11:37:31 AM
To: tozziart...@libero.it
Cc: fis
Subject: Re: [Fis] Information: a metaphysical word

Dear Arturo,

I am less pessimistic than you about treating and measuring information.

First off, that information is always relative is the obverse of the third
law of thermodynamics. It cannot be otherwise.


Secondly, you are correct that there are important metaphysical aspects of
information. To my knowledge, it is the only discipline predicated on
*absence* -- the absence of constraint (popularly characterized as
"uncertainty"). We know from the third law that such entropic-like
measures are always relative to some assumed reference. Actual information
is calculated as a decrease in apophasis and shares that same relativity.

While you might feel that the metaphysical associations disqualify
information as an instrument of science, I would suggest that it rather
opens a new window onto our vision of reality.


Should you think information measures useless because of such metaphysical
associations, I would submit that measures of apophasis can be quite
useful in remediation of environmental problems (and problems in a host of
other realms as well). (See the example beginning on p51 of
.)

Let me end by saying that I understand fully your exasperation with
information theory (IT). For almost two decades I abjured IT, because I
considered it nonsensical that a TV screen with "snow" (no signal) should
have more information than a picture of a movie star. (My vexation was
based on similar reasons as yours.) Then it finally dawned on me that some
of the founders of IT had made serious pedagogical errors with their
definitions. I eventually sorted out my own perspective (See Chapter 5 of
), and went
on to build my entire career on concepts related to IT.

I would encourage you give it all another look. IT can be quite rewarding!

Peace,
Bob

>
> Dear FISers,
> The current debate about information has just a possible development, I
> think.
> Everybody defines information in the way he prefers: subjective, biotic,
> bit, and so on.
> Therefore, every study that talks about "information" is meaningless.
> In particular, subjective accounts of information are useless, because, in
> their framework, the information is not measurable, but just depends on
> the observer: if me, John and Mary see the same can, I think that the Coke
> is good, John thinks that he is thirsty and Mary that the aluminium is a
> malleable material.
> On the other side, I suggested in a previous post how the information
> entropy (such as Shannon's, or Bekenstein's, or Hawking's) may change
> according to the relativistic speed of the hypothetical observer.
> Therefore, I suggest to fully remove the term "information" from every
> scientific account.  The term "information" refers, in Popper's terms, to
> a not falsifiable theory, to pseudoscience: it is a metaphysical claim,
> like the concepts of Essence, Being, God and so on.
> Therefore, by now, the term "information" is definitely out of my
> scientific vocabulary.
>
> --
> Sent from Libero Mail for Android
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] PLANCKIAN INFORMATION: A NEW MEASURE OF ORDER (From S. Ji)

2017-03-23 Thread Sungchul Ji
Hi Pedro,


Thanks for the excellent job done.


Sung



From: Fis  on behalf of Pedro C. Marijuan 

Sent: Thursday, March 23, 2017 6:25 AM
To: 'fis'
Subject: [Fis] PLANCKIAN INFORMATION: A NEW MEASURE OF ORDER (From S. Ji)

Note: what follows is an abbreviated text taken from the presentation.
The whole file, too big for our list, can be found at fis web pages:
http://fis.sciforum.net/wp-content/uploads/sites/2/2014/11/Planckian_information.pdf
A very recent article developing similar ideas: 
http://www.mdpi.com/2078-2489/8/1/24



Greetings to all--Pedro
---


What is the Planckian information?

SUNGCHUL JI

Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
s...@pharmacy.rutgers.edu


The Planckian information (I_P) is defined as the information produced (or 
used) by the so-called Planckian processes, which are in turn defined as any 
physicochemical or formal processes that generate long-tailed histograms 
fitting the Planckian Distribution Equation (PDE),
   y = (A/(x + B)^5)/(exp(C/(x + B)) – 1)                        (1)
 where A, B and C are free parameters, x is the class or the bin to which  
objects or entities belong, and y is the frequency [1, 1a].  The PDE was 
derived in 2008 [2] from the blackbody radiation equation discovered by M. 
Planck (1858-1947) in 1900, by replacing the universal constants and 
temperature with free parameters, A, B and C.  PDE has been found to fit not 
only the blackbody radiation spectra (as it should) but also numerous other 
long-tailed histograms [3, 4] (see Figure 1).
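
For readers who want the substitution spelled out (my own paraphrase of the 
derivation described in [2], using the standard textbook form of Planck's law 
for spectral energy density):

   u(λ, T) = (8πhc/λ^5)/(exp(hc/λkT) – 1)

Replacing the constant prefactor 8πhc with the free parameter A, the constant 
hc/kT in the exponent with C, and the wavelength λ with the shifted bin 
variable x + B then yields Eq. (1).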
One possible explanation for the universality of PDE is that many long-tailed 
histograms are generated by some selection mechanisms acting on 
randomly/thermally accessible processes [3]. Since random processes obey the 
Gaussian distribution, the ratio of the area under the curve (AUC) of PDE to 
that of Gaussian-like symmetric curves can be used as a measure of 
non-randomness or the order generated by the Planckian processes.

As can be seen in Figs. 1 (g), (i), (k), (o), (r) and (t), the curves labeled 
‘Gaussian’ or ‘Gaussian-like’ overlap with the rising phase of the PDE curves.  
The ‘Gaussian-like’ curves were generated by Eq. (2), which was derived from 
the Gaussian equation by replacing its pre-exponential factor with free 
parameter A:

  y = A exp(–(x – μ)^2/(2σ^2))                                   (2)

The degree of mismatch between the area under the curve (AUC) of PDE, Eq. (1), 
and that of GLE, Eq. (2), is postulated to be a measure of non-randomness (and 
hence order).  GLE is associated with random processes, since it is symmetric 
with respect to the sign reversal of (x – µ) in its exponential term.  This 
measure of order is referred to as the Planckian Information (I_P), defined 
quantitatively as shown in Eq. (3) or Eq. (4):

  I_P = log_2 (AUC(PDE)/AUC(GLE))   bits                         (3)

or

  I_P = log_2 [∫P(x)dx / ∫G(x)dx]   bits                         (4)

where P(x) and G(x) are the Planckian Distribution Equation and the 
Gaussian-Like Equation, respectively.
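
In practice, evaluating Eq. (3)/(4) requires fitting both equations to the same 
histogram before integrating.  An end-to-end sketch in Python follows; the 
synthetic data, starting values, and bounds are my own choices for illustration, 
not prescriptions from the cited work:

  import numpy as np
  from scipy.optimize import curve_fit

  def pde(x, A, B, C):
      # Planckian Distribution Equation, Eq. (1)
      return (A / (x + B)**5) / (np.exp(C / (x + B)) - 1.0)

  def gle(x, A, mu, sigma):
      # Gaussian-Like Equation, Eq. (2)
      return A * np.exp(-(x - mu)**2 / (2.0 * sigma**2))

  # Synthetic long-tailed histogram (bin index x, frequency y) standing in
  # for real data such as speech or metabolic measurements.
  x = np.arange(1, 31, dtype=float)
  y = pde(x, 2.0e5, 2.0, 10.0) + np.random.default_rng(1).normal(0.0, 0.3, x.size)

  p_pde, _ = curve_fit(pde, x, y, p0=[1e5, 1.0, 5.0],
                       bounds=([0.0, 0.1, 0.1], [1e7, 10.0, 100.0]))
  p_gle, _ = curve_fit(gle, x, y, p0=[y.max(), 1.0, 3.0],
                       bounds=([0.0, 0.0, 0.1], [1e4, 30.0, 30.0]))

  auc_pde = np.trapz(pde(x, *p_pde), x)   # numerator of Eq. (4)
  auc_gle = np.trapz(gle(x, *p_gle), x)   # denominator of Eq. (4)
  print("I_P =", np.log2(auc_pde / auc_gle), "bits")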

It is generally accepted that there are at least three basic aspects to 
information – amount, meaning, and value.  Planckian information is primarily 
concerned with the amount (and hence the quantitative aspect) of information.  
There are numerous ways that have been suggested in the literature for 
quantifying information besides the well-known Hartley information, Shannon 
entropy, algorithmic information, etc. [5].  The Planckian information, given by 
Equation (3), is a new measure of information that applies to the Planckian 
process generally defined as in (5):

“Planckian processes are the physicochemical, neurophysiological,       (5)
biomedical, mental, linguistic, socioeconomic, cosmological, or any
other processes that generate long-tailed histograms obeying the
Planckian distribution equation (PDE).”

The Planckian information represents the degree of organization of physical (or 
nonphysical) systems in contrast to the Boltzmann or the Boltzmann-Gibbs 
entropy which represents the disorder/disorganization of a physical