Re: [Fis] If always n>0 why we need log

2018-06-03 Thread Terrence W. DEACON
I am in agreement with Guy Hoelzer in his assessment of the use of log-transformed data. Since I regularly deal with biological growth processes, using log-transformed data is the clearest way to analyze proportional relationships in nonlinear systems. By virtue of the way it compresses
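A minimal sketch of the log-transform point, with made-up growth data (numpy assumed; the data and the rate 0.3 are hypothetical, not from the thread): exponential growth y = y0 * exp(r*t) is nonlinear in t, but log(y) is linear, so the proportional rate falls out as a slope.

```python
import numpy as np

# Hypothetical growth data: y = y0 * exp(r * t) with y0 = 2.0, r = 0.3.
t = np.linspace(0.0, 10.0, 50)
y = 2.0 * np.exp(0.3 * t)

# After a log transform the model is linear, log(y) = log(y0) + r * t,
# so an ordinary least-squares line recovers the proportional rate r.
slope, intercept = np.polyfit(t, np.log(y), 1)
print(f"rate r ~ {slope:.3f}")              # ~ 0.300
print(f"y0     ~ {np.exp(intercept):.3f}")  # ~ 2.000
```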

Re: [Fis] If always n>0 why we need log

2018-06-03 Thread Guy A Hoelzer
Dear Sung et al., I appreciate human bias in terms of numerical scale, but I don’t think that is what we actually achieve by using logarithms. If the universe of possibility is fractal, using a logarithm does not eliminate the problem of large numbers. I think the primary outcome achieved by

Re: [Fis] No, this is not the reason.

2018-06-03 Thread Mark Johnson
Dear Krassimir and Sungchul, I suppose this bears out Von Neumann's tongue-in-cheek advice to Shannon! (http://www.eoht.info/page/Neumann-Shannon+anecdote) Krassimir, just to ask about Boltzmann's use of the logs... I first understood this to be a measure of the probability distribution of a

Re: [Fis] If always n>0 why we need log

2018-06-03 Thread Pedro C. Marijuan
Sorry Sung, you know about the rules of engagement in this list... you have gone to 5 msgs. And that means one and a half weeks of sanction. Even more so after I warned you privately several times. Anyhow, tomorrow I will make public an embarrassing bureaucratic procedure that the list has to

Re: [Fis] Logarithm

2018-06-03 Thread Karl Javorszky
For establishing the upper limit of the maximal number of commutative groups on sets, the logarithm aptly pictures the decreasing probability of finding a new constellation of symbols as the number of factors in the divisor keeps increasing. Hans von Baeyer wrote on Sun., 3 June 2018 21:53:

[Fis] Logarithm

2018-06-03 Thread Hans von Baeyer
For entropy we do need the log, because the chemists already knew that it is additive, whereas probability and "the number of ways" are multiplicative. Hans Christian von Baeyer
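Von Baeyer's additivity point, spelled out with the standard Boltzmann relation (the derivation is mine, not quoted from the message): for two independent systems the numbers of ways multiply, and only the logarithm turns that product into a sum.

```latex
% Boltzmann entropy S = k \ln W, where W counts "the number of ways".
% For independent systems W multiplies, so S adds:
\begin{aligned}
W_{12} &= W_1 W_2, \\
S_{12} &= k \ln(W_1 W_2) = k \ln W_1 + k \ln W_2 = S_1 + S_2.
\end{aligned}
```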

[Fis] No, this is not the reason.

2018-06-03 Thread Krassimir Markov
Dear Sung, You wrote: > I think the main reason that we express 'information' as a logarithmic function of the number of choices available, n, may be because the human brain finds it easier to remember (and communicate and reason with) 10 than 100, or 100 than 10 . . . 0,

Re: [Fis] If always n>0 why we need log

2018-06-03 Thread Sungchul Ji
Hi Krassimir, I think the main reason that we express 'information' as a logarithmic function of the number of choices available, n, may be because the human brain finds it easier to remember (and communicate and reason with) 10 than 100, or 100 than 10 . . . 0, etc.

Re: [Fis] If always n>0 why we need log

2018-06-03 Thread Sungchul Ji
Hi Krassimir, I think the main reason that we express 'information' as a logarithmic function of the number of choices, n, may be because the human brain finds it easier to remember (and communicate and reason with) 10 than 100, or 100 than 10 . . . 0, etc. All the
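A quick illustration of the compression Sung describes (my sketch, not from the post): a base-10 logarithm reduces a 1 followed by n zeros to the single number n.

```python
import math

# log10 maps 10...0 (a 1 followed by n zeros) to n, which is the
# compression Sung argues makes such numbers easier to remember.
for x in (10, 100, 10_000, 10_000_000_000):
    print(f"{x:>14,d} -> log10(x) = {math.log10(x):.0f}")
```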

[Fis] If always n>0 why we need log

2018-06-03 Thread Krassimir Markov
Dear Sung, A simple question: if n is always > 0, why do we need the log in I = -log_2(m/n) = -log_2(m) + log_2(n)? (1) Friendly greetings Krassimir
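A numeric check of Eq. (1), with m and n values of my own choosing: the log turns the ratio of counts into a difference, which is the additivity that other posts in the thread (von Baeyer, Deacon) point to as the reason the log is needed.

```python
import math

# Eq. (1): I = -log2(m/n) = -log2(m) + log2(n).  With m = 4 selected
# outcomes out of n = 1024 equally likely ones, I is 8 bits.
m, n = 4, 1024
lhs = -math.log2(m / n)
rhs = -math.log2(m) + math.log2(n)
assert math.isclose(lhs, rhs)
print(lhs)  # 8.0
```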

Re: [Fis] The information-entropy relation clarified: The New Jerseyator

2018-06-03 Thread Sungchul Ji
Hi FISers, I found a typo in the legend to Figure 1 in my last post: ". . . without energy dissipation, no energy . . ." should read "Without energy dissipation, no information." (4). In