Re: [Fis] No, this is not the reason

2018-06-03 Thread Joseph Brenner
The utility of logarithms? I've always thought using a slide rule was more
esthetic than pushing buttons on a calculator. Perhaps few people still know
what a slide rule can do. Of course, the result might be a little less
accurate . . .
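
(A slide rule multiplies by adding lengths proportional to the
logarithms of the factors - a minimal sketch in Python, where rounding
to three significant figures is my own stand-in for reading a physical
scale:)

    import math

    def slide_rule_multiply(a, b):
        # Add the two log-lengths, then read the product off the scale.
        length = math.log10(a) + math.log10(b)
        product = 10 ** length
        # Reading a real scale gives roughly three significant figures.
        return float(f"{product:.3g}")

    print(slide_rule_multiply(2.34, 5.67))  # ~13.3 (exact: 13.2678)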

Best,

Joseph 
 
-----Original Message-----
From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Mark Johnson
Sent: Sunday, 3 June 2018 22:07
To: Krassimir Markov
Cc: Foundation of Information Science; Sungchul Ji
Subject: Re: [Fis] No, this is not the reason.

Dear Krassimir and Sungchul,

I suppose this bears out Von Neumann's tongue-in-cheek advice to
Shannon! (http://www.eoht.info/page/Neumann-Shannon+anecdote)

Krassimir, just to ask about Boltzmann's use of the logs... I first
understood this to be a measure of the probability of a whole
thermodynamic system, which factorises into the product of the
probabilities of its microstates. Taking the log turns that product
into a sum, so the measure becomes additive over the parts. Hence the
logs (and hence Shannon's equating of "microstate" with "alphabet",
which seems reasonable at first glance)...
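
(A minimal sketch of that factorisation in Python - the two
subsystems and their probabilities are my own illustrative
assumption, not anything from Boltzmann or Shannon:)

    import math

    # Two independent subsystems; the probability of the joint state
    # is the product of the parts.
    p1 = 0.25
    p2 = 0.10
    p_joint = p1 * p2

    # Taking the log turns the product into a sum, so the log-measure
    # of the whole system is additive over its subsystems.
    assert math.isclose(-math.log(p_joint),
                        -math.log(p1) + -math.log(p2))
    print(-math.log(p_joint))  # log(ab) = log(a) + log(b)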

EXCEPT I very much like the explanation that Bob Ulanowicz gives here
(in http://www.mdpi.com/2078-2489/2/4/624), which doesn't mention the
factorising of the probabilities of microstates but instead argues
that -log(p(i)) gives a value for what isn't there (the "apophatic",
"absence") - and I also like Bob's criticism of Shannon for inverting
this by turning his H into a measure of surprise:

"Boltzmann described a system of rarefied, non-interacting particles
in probabilistic fashion. Probability theory quantifies the degree to
which state i is present by a measure, p(i). Conventionally, this
value is normalized to fall between zero and one by dividing the
number of times that i has occurred by the total number of
observations. Under this "frequentist" convention, the probability of
i not occurring becomes (1 - p(i)). Boltzmann's genius, however, was
in abjuring this conventional measure of non-occurrence in favor of
the negative of the logarithm of p(i). (It should be noted that
-log(p(i)) and (1 - p(i)) vary in uniform fashion, i.e., a one-to-one
relationship between the two functions exists). His choice imposed a
strong asymmetry upon matters. Conventionally, calculating the average
nonbeing in the system using (1 - p(i)) results in the symmetrical
parabolic function (p(i) - p(i)^2). If, however, one calculates
average absence using Boltzmann's measure, the result becomes skewed
towards smaller p(i) (or larger [1 - p(i)]), i.e., towards nonbeing."
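
(Bob's asymmetry is easy to see numerically. A minimal sketch, again
in Python; the grid of p values is my own choice:)

    import math

    # Average absence contributed by a state of probability p:
    # conventional p*(1 - p) versus Boltzmann's -p*log(p).
    for p in [0.01, 0.1, 0.25, 0.5, 0.75, 0.9, 0.99]:
        conventional = p * (1 - p)     # symmetric parabola p - p^2
        boltzmann = -p * math.log(p)   # skewed towards smaller p
        print(f"p={p:<5}  p-p^2={conventional:.4f}  "
              f"-p*log(p)={boltzmann:.4f}")

    # The parabola is symmetric about p = 0.5, while -p*log(p) peaks
    # at p = 1/e ~ 0.37, i.e., skewed towards nonbeing, as Bob says.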

It's such a useful equation, and I agree, "Why are the logs there?" is
an important question.

Best wishes,

Mark

On 3 June 2018 at 20:22, Krassimir Markov  wrote:
> Dear Sung,
>
> You wrote:
>> I think the main reason that we express 'information' as a logarithmic
>> function of the number of choices available, n, may be because the human
>> brain finds it easier to remember (and communicate and reason with) 10
>> than 100, or 100 than 10...0, etc.
>
> No, this is not the reason.
> The correct answer is that Shannon assumed n = 0 to be possible!!!
> Because of this, to avoid dividing by zero, he used log(s).
> But this is impossible, and for many years the world has worked with
> log(s) without understanding why!
>
> The log(s) are not needed.
>
> It is clearer and easier to work without log(s) :=)
>
> Friendly greetings,
> Krassimir



-- 
Dr. Mark William Johnson
Institute of Learning and Teaching
Faculty of Health and Life Sciences
University of Liverpool

Phone: 07786 064505
Email: johnsonm...@gmail.com
Blog: http://dailyimprovisation.blogspot.com





___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

