### Re: [Fis] If always n>0 why we need log

```I am in agreement with Guy Hoelzer in his assessment of the use of
log-transformed data.
Since I regularly deal with biological growth processes, using
log-transformed data is the clearest way to analyze proportional
relationships in nonlinear systems.
By virtue of the way it compresses multiplicative relations, log
transformation makes scale-free comparison much more tractable and
correlations much more obvious.
And compression is one of the most important benefits of mathematical
analysis.

On Sun, Jun 3, 2018 at 2:04 PM, Guy A Hoelzer  wrote:

> Dear Sung et al.,
>
> I appreciate human bias in terms of numerical scale, but I don’t think
> that is what we actually achieve by using logarithms.  If the universe of
> possibility is fractal, using a logarithm does not eliminate the problem of
> large numbers.  I think the primary outcome achieved by using logarithms is
> that units come to represent proportions rather than absolute (fixed scale)
> amounts.  It reveals an aspect of scale-free form.
>
>
>
> On Jun 3, 2018, at 10:42 AM, Sungchul Ji  wrote:
>
> Hi Krassimir,
>
> I think the main reason that we express 'information'  as a logarithmic
> function of the number of choices available, n, may be because the human
> brain finds it easier to remember (and communicate and reason with) 10
> than 100, or 100 than 10...0, etc.
>
> All the best.
>
> Sung
>
>
>
> --
> *From:* Krassimir Markov
> *Sent:* Sunday, June 3, 2018 12:06 PM
> *To:* Foundation of Information Science
> *Cc:* Sungchul Ji
> *Subject:* If always n>0 why we need log
>
> Dear Sung,
>
> A simple question:
>
>
> I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)
>
> Friendly greetings
>
> Krassimir
>
>
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>
>
>
>
>

--
Professor Terrence W. Deacon
University of California, Berkeley

```
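Deacon's point above about log-transformed growth data can be sketched numerically (my own illustration, not part of the thread): a quantity that grows by a constant factor looks wildly nonlinear on the raw scale, but a log transform turns equal ratios into equal distances.

```python
import math

# Hypothetical growth series: the quantity triples at every step (multiplicative).
series = [100 * 3 ** t for t in range(6)]

# On the raw scale the step-to-step differences explode...
raw_steps = [b - a for a, b in zip(series, series[1:])]

# ...but after a log transform every step is the same size: equal *ratios*
# become equal *distances*, which is why proportional relationships in
# nonlinear systems are easiest to see in log space.
log_series = [math.log(x) for x in series]
log_steps = [b - a for a, b in zip(log_series, log_series[1:])]

print(raw_steps)  # growing without bound
print(log_steps)  # each approximately log(3) ~= 1.0986
```

The constant log-step is exactly the "proportional relationship" the post describes; any exponential process would show the same flattening.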

### Re: [Fis] If always n>0 why we need log

```Dear Sung et al.,

I appreciate human bias in terms of numerical scale, but I don’t think that is
what we actually achieve by using logarithms.  If the universe of possibility
is fractal, using a logarithm does not eliminate the problem of large numbers.
I think the primary outcome achieved by using logarithms is that units come to
represent proportions rather than absolute (fixed scale) amounts.  It reveals
an aspect of scale-free form.

On Jun 3, 2018, at 10:42 AM, Sungchul Ji
<s...@pharmacy.rutgers.edu> wrote:

Hi Krassimir,

I think the main reason that we express 'information'  as a logarithmic
function of the number of choices available, n, may be because the human brain
finds it easier to remember (and communicate and reason with) 10 than
100, or 100 than 10...0, etc.

All the best.

Sung

From: Krassimir Markov <mar...@foibg.com>
Sent: Sunday, June 3, 2018 12:06 PM
To: Foundation of Information Science
Cc: Sungchul Ji
Subject: If always n>0 why we need log

Dear Sung,

A simple question:

I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)

Friendly greetings

Krassimir


```
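Hoelzer's remark that log units "come to represent proportions rather than absolute amounts" has a two-line numerical face (my own sketch): a doubling is the same distance in log space no matter where on the absolute scale it happens.

```python
import math

# A doubling near 100 and a doubling near 1,000,000: very different
# absolute changes, but identical distances on the log scale.
small_doubling = math.log(200) - math.log(100)
large_doubling = math.log(2_000_000) - math.log(1_000_000)

print(small_doubling, large_doubling)  # both equal log(2) ~= 0.6931
```

This is the scale-free property: the log unit measures "times bigger", not "how much bigger".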

### Re: [Fis] No, this is not the reason.

```Dear Krassimir and Sungchul,

I suppose this bears out Von Neumann's tongue-in-cheek advice to
Shannon! (http://www.eoht.info/page/Neumann-Shannon+anecdote)

Krassimir, just to ask about Boltzmann's use of the logs... I first
understood this to be a measure of the probability distribution of a
whole thermodynamic system which factorises into the product of
probabilities of microstates in the system. Hence the logs (and hence
Shannon's equating of "microstate" with "alphabet", which seems
reasonable at first glance)...

EXCEPT I very much like the explanation that Bob Ulanowicz gives here
(in http://www.mdpi.com/2078-2489/2/4/624) - which doesn't mention the
factorising of the probabilities of microstates, but instead argues
that -log (p(i)) gives a value for what isn't there (the "apophatic",
"absence"), and Bob's criticism of Shannon for inverting this by
turning his H into a measure of surprise:

"Boltzmann described a system of rarefied, non-interacting particles
in probabilistic fashion. Probability theory quantifies the degree to
which state i is present by a measure, p(i). Conventionally, this
value is normalized to fall between zero and one by dividing the
number of times that i has occurred by the total number of
observations. Under this “frequentist” convention, the probability of
i not occurring becomes (1 − p(i)). Boltzmann’s genius, however, was
in abjuring this conventional measure of non-occurrence in favor of
the negative of the logarithm of p(i).  (It should be noted that
−log(p(i)) and (1 − p(i)) vary in uniform fashion, i.e., a one-to-one
relationship between the two functions exists). His choice imposed a
strong asymmetry upon matters. Conventionally, calculating the average
nonbeing in the system using (1 − p(i)) results in the symmetrical
parabolic function (p(i) − p(i)^2). If, however, one calculates
average absence using Boltzmann’s measure, the result becomes skewed
towards smaller p(i) (or larger [1 − p(i)]), i.e., towards nonbeing."

It's such a useful equation, and I agree, "Why are the logs there?" is
an important question.

Best wishes,

Mark

On 3 June 2018 at 20:22, Krassimir Markov  wrote:
> Dear Sung,
>
> You wrote:
>> I think the main reason that we express 'information'  as a logarithmic
> function of the number of choices available, n, may be because the human
> brain finds it easier to remember (and communicate and reason with)  10
> than  100, or 100 than 10. . . . 0, etc.
>>
>
> No, this is not the reason.
> The correct answer is that Shannon assumed that n=0 is possible !!!
> Because of this, to avoid dividing by zero, he used log(s).
> But n=0 is impossible, and for many years the world has worked with log(s)
> without understanding why !
>
> Log(s) are not needed.
>
> It is clearer and easier to work without log(s) :=)
>
> Friendly greetings
> Krassimir
>
>
>
>

--
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com


```
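The contrast in the Ulanowicz passage can be checked numerically (a sketch of the quoted claim, not code from the paper): weighting non-occurrence by p(i) gives the symmetric parabola p - p^2 for the conventional measure (1 - p), versus the skewed -p·log(p) for Boltzmann's measure, whose peak sits at p = 1/e rather than p = 0.5.

```python
import math

def avg_absence_conventional(p):
    # average non-occurrence using (1 - p): gives p - p**2, a parabola
    # symmetric about p = 0.5
    return p * (1 - p)

def avg_absence_boltzmann(p):
    # average absence using -log(p): gives -p*log(p), skewed toward small p
    return -p * math.log(p)

grid = [i / 10000 for i in range(1, 10000)]  # open interval (0, 1)
peak_conventional = max(grid, key=avg_absence_conventional)
peak_boltzmann = max(grid, key=avg_absence_boltzmann)

print(peak_conventional)  # 0.5        (symmetric)
print(peak_boltzmann)     # ~= 0.3679  (1/e: skewed toward nonbeing, as quoted)
```

The shift of the peak from 0.5 down to 1/e is exactly the "strong asymmetry" the quote attributes to Boltzmann's choice.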

### Re: [Fis] If always n>0 why we need log

Sorry Sung, you know about the rules of engagement in this list... you
have gone to 5 msgs. And that means one and a half weeks of sanction. Even
more after having warned you privately several times.
Anyhow, tomorrow I will make public an embarrassing bureaucratic
procedure that the list has to suffer next days.

Best wishes--Pedro

On 03/06/2018 at 19:42, Sungchul Ji wrote:

Hi Krassimir,

I think the main reason that we express 'information'  as a
logarithmic function of the number of choices available, n, may
be because the human brain finds it easier to remember (and
communicate and reason with) 10 than 100, or 100 than
10...0, etc.

All the best.

Sung

*From:* Krassimir Markov
*Sent:* Sunday, June 3, 2018 12:06 PM
*To:* Foundation of Information Science
*Cc:* Sungchul Ji
*Subject:* If always n>0 why we need log
Dear Sung,

A simple question:

I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)

Friendly greetings

Krassimir


--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 0
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
-


```

### Re: [Fis] Logarithm

For establishing the upper limit on the maximal number of commutative
groups on sets, the logarithm nicely pictures the decreasing probability of
finding a new constellation of symbols as the number of factors in the
divisor keeps increasing.

Hans von Baeyer wrote on Sun., 3 June 2018 21:53:

> For entropy we do need the log, because the chemists already knew that it
> is additive, whereas probability and "the number of ways" are
> multiplicative.
>
> Hans Christian von Baeyer
>

```
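Von Baeyer's point takes two lines to verify (my own sketch): microstate counts of independent subsystems multiply, measured entropies add, and the logarithm is precisely the map that turns products into sums.

```python
import math

# Two independent subsystems with w1 and w2 microstates ("number of ways").
w1, w2 = 8, 16

# The combined system has w1 * w2 microstates (multiplicative)...
combined = w1 * w2

# ...but its log is the sum of the individual logs (additive), which is
# why S = k log W makes entropy an additive quantity, as the chemists knew.
assert math.isclose(math.log2(combined), math.log2(w1) + math.log2(w2))
print(math.log2(combined))  # 7.0 = 3.0 + 4.0
```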

### [Fis] Logarithm

```For entropy we do need the log, because the chemists already knew that it
is additive, whereas probability and "the number of ways" are
multiplicative.

Hans Christian von Baeyer

```

### [Fis] No, this is not the reason.

```Dear Sung,

You wrote:
> I think the main reason that we express 'information'  as a logarithmic
function of the number of choices available, n, may be because the human
brain finds it easier to remember (and communicate and reason with)  10
than  100, or 100 than 10. . . . 0, etc.
>

No, this is not the reason.
The correct answer is that Shannon assumed that n=0 is possible !!!
Because of this, to avoid dividing by zero, he used log(s).
But n=0 is impossible, and for many years the world has worked with log(s)
without understanding why !

Log(s) are not needed.

It is clearer and easier to work without log(s) :=)

Friendly greetings
Krassimir


```

### Re: [Fis] If always n>0 why we need log

```Hi Krassimir,

I think the main reason that we express 'information'  as a logarithmic
function of the number of choices available, n, may be because the human brain
finds it easier to remember (and communicate and reason with) 10 than
100, or 100 than 10...0, etc.

All the best.

Sung

From: Krassimir Markov
Sent: Sunday, June 3, 2018 12:06 PM
To: Foundation of Information Science
Cc: Sungchul Ji
Subject: If always n>0 why we need log

Dear Sung,

A simple question:

I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)

Friendly greetings

Krassimir


```

### Re: [Fis] If always n>0 why we need log

```Hi Krassimir,

I think the main reason that we express 'information'  as a logarithmic
function of the number of choices, n, may be because the human brain finds it
easier to remember (and communicate and reason with) 10 than 100, or
100 than 10...0, etc.

All the best.

Sung

From: Krassimir Markov
Sent: Sunday, June 3, 2018 12:06:54 PM
To: Foundation of Information Science
Cc: Sungchul Ji
Subject: If always n>0 why we need log

Dear Sung,

A simple question:

I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)

Friendly greetings

Krassimir


```

### [Fis] If always n>0 why we need log

```Dear Sung,

A simple question:

I = -log_2(m/n) = - log_2 (m) + log_2(n)   (1)

Friendly greetings

Krassimir


```
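Krassimir's Eq. (1) is the quotient rule for logarithms, and it holds for any m and n with n > 0; a quick numerical check (the values m = 4, n = 32 are my own, chosen for illustration):

```python
import math

m, n = 4, 32  # arbitrary illustrative choices with n > 0

# Eq. (1): -log2(m/n) = -log2(m) + log2(n)
lhs = -math.log2(m / n)
rhs = -math.log2(m) + math.log2(n)

assert math.isclose(lhs, rhs)
print(lhs)  # -log2(4/32) = log2(8) = 3.0
```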

### Re: [Fis] The information-entropy relation clarified: The New Jerseyator

```Hi FISers,

I found a typo in the legend to Figure 1 in my last post:  ". . . without
energy dissipation, no energy, . . ." should read

"Without energy dissipation, no information."     (4)

In fact, Statement (4) may be fundamental to informatics in general, so that
it may be referred to as the "First Principle of Informatics" (FPI).

If this conjecture is correct, FPI may apply to the controversial
interpretations of the wavefunction of a material system (WFMS), since WFMS is
supposed to encode all the information we have about the material system under
consideration and hence implicates "information".  It thus seems to me that the
complete interpretation of a wavefunction, according to FPI, must specify the
selection process as well, i.e., the free energy-dissipating step, which I am
tempted to identify with "measurement", "quantum jump", or "wavefunction
collapse".

I am not a quantum mechanician, so it is possible that I have committed some
logical errors somewhere in my argument above.  If you detect any, please let
me know.

With all the best.

Sung

From: Fis  on behalf of Sungchul Ji

Sent: Sunday, June 3, 2018 12:13:11 AM
To: 'fis'
Subject: [Fis] The information-entropy relation clarified: The New Jerseyator

Hi  FISers,

One simple (and maybe too simple) way to distinguish between information and
entropy may be as follows:

(i)  Define  information (I) as in Eq. (1)

I = -log_2(m/n) = -log_2(m) + log_2(n)     (1)

where n is the number of all possible choices (also called variety) and m is
the actual choices made or selected.

(ii) Define the negative binary logarithm of n, i.e., -log_2(n), as the
'variety' of all possible choices and hence identical with Shannon entropy H,
as suggested by Wicken [1].  Then Eq. (1) can be re-written as Eq. (2):

I = -log_2(m) - H     (2)

(iii) It is evident that when m = 1 (i.e., when only one choice is made out of
all the variety of choices available), Eq. (2) reduces to Eq. (3):

I = -H     (3)

(iv) As is well known, Eq. (3) is the basis for the so-called "negentropy
principle of information" first advocated by Schrödinger and later by
Brillouin and others.  But Eq. (3) is clearly not a principle but a special
case of Eq. (2) with m = 1.

(v)  In conclusion, I claim that information and negative entropy are the same
neither qualitatively nor quantitatively (except when m = 1 in Eq. (2)) and
represent two opposite nodes of a fundamental triad [2]:

                      Selection
    H  ---------------------------------->  I
 (uncertainty                         (uncertainty
  before selection)                    after selection)

Figure 1.  The New Jerseyator model of information (NMI) [3].  Since selection
requires free energy dissipation, NMI implicates both information and energy.
That is, without energy dissipation, no energy, and hence NMI may be viewed as
a self-organizing process (also called a dissipative structure) or an '-ator'.
NMI is also consistent with the "uncertainty reduction model of information."

With all the best.

Sung

P.s.  There is experimental evidence that information and entropy are
orthogonal, thus giving rise to the Planck-Shannon plane, which has been shown
to distinguish between cancer and healthy cell mRNA levels.  I will discuss
this in a later post.

References:

[1]  Wicken, J. S. (1987).  Entropy and information: suggestions for common
language.  Phil. Sci. 54: 176-193.

[2]  Burgin, M. (2010).  Theory of Information: Fundamentality, Diversity,
and Unification.  World Scientific Publishing, New Jersey.

[3]  Ji, S. (2018).  The Cell Language Theory: Connecting Mind and Matter.
World Scientific Publishing, New Jersey.  Figure 10.24.

```
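Sung's derivation in steps (i)-(iii) can be traced numerically under his own stated (and unconventional) sign convention H = -log_2(n) (a sketch; the variable names and the values n = 16, m = 2 are mine): Eq. (2) then reproduces Eq. (1), and setting m = 1 collapses it to Eq. (3), I = -H.

```python
import math

n = 16  # number of all possible choices (the "variety")
m = 2   # number of choices actually selected

# Eq. (1): I = -log2(m/n) = -log2(m) + log2(n)
I_eq1 = -math.log2(m / n)

# Step (ii) convention: H = -log2(n), so Eq. (2) reads I = -log2(m) - H
H = -math.log2(n)
I_eq2 = -math.log2(m) - H
assert math.isclose(I_eq1, I_eq2)  # Eq. (2) agrees with Eq. (1)

# Eq. (3): with m = 1 (a single selection), I reduces to -H
I_single = -math.log2(1) - H
assert math.isclose(I_single, -H)

print(I_eq1, I_single)  # 3.0 and 4.0 for these illustrative values
```

Whether H "should" carry that minus sign is exactly the point under debate elsewhere in the thread; the check above only shows that Sung's three equations are mutually consistent as written.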