On Fri, Jul 15, 2011 at 6:47 PM, Ted Dunning <[email protected]> wrote:
> Sort of.  It would be Shannon entropy if the sum x_i = 1.

Right, yes, that's why one would divide by N = sum(x) to make that so.


>> But what it computes now is the sum of -x * log(x/N). Seems like a bit

My question was what this would be the entropy of, but I think you hit
on it just below -- it's not necessarily an entropy, it just looks like
one. In any event, if I understand correctly, this is just N times the
entropy of the normalized distribution x/N, so it differs only by a
constant factor.
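
To make that concrete, here is a quick standalone check in Java
(nothing Mahout-specific; the class and method names are just for
illustration):

    public class EntropyScaling {
      // The quantity in question: sum over raw counts of -x * log(x / N).
      static double unnormalizedEntropy(double[] x) {
        double n = 0.0;
        for (double v : x) { n += v; }
        double sum = 0.0;
        for (double v : x) {
          if (v > 0) { sum -= v * Math.log(v / n); }
        }
        return sum;
      }

      // Plain Shannon entropy of a distribution p with sum(p) = 1.
      static double shannonEntropy(double[] p) {
        double h = 0.0;
        for (double v : p) {
          if (v > 0) { h -= v * Math.log(v); }
        }
        return h;
      }

      public static void main(String[] args) {
        double[] x = {3, 5, 2};  // raw counts; N = 10
        double n = 10.0;
        double[] p = new double[x.length];
        for (int i = 0; i < x.length; i++) { p[i] = x[i] / n; }
        // Both lines print the same value, ~10.2965:
        System.out.println(unnormalizedEntropy(x));
        System.out.println(n * shannonEntropy(p));
      }
    }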


> This means that the maximum log likelihood is
>
>      max_\pi \log p(K | \vec \pi) = \sum_i k_i \log (k_i / N) + \log Z
>
> The log-likelihood ratio involves three such expressions.
>
> The similarity to Shannon entropy here is either very deep or coincidental,
> depending on the day of the week.

That makes sense. It isn't necessarily entropy that this is
calculating, but something entropy-shaped that falls out of the max
likelihood for this multinomial distribution.
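
For what it's worth, here is a rough sketch of how three of those
entropy-shaped expressions combine into the 2x2 log-likelihood ratio
(this is the standard G-test form, not the actual Mahout source; the
names are mine):

    public class Llr {
      // H'(x) = -sum x_i * log(x_i / N), the unnormalized form above.
      static double hPrime(long... counts) {
        long n = 0;
        for (long k : counts) { n += k; }
        double sum = 0.0;
        for (long k : counts) {
          if (k > 0) { sum -= k * Math.log((double) k / n); }
        }
        return sum;
      }

      // k11..k22 are the cells of a 2x2 contingency table.
      static double logLikelihoodRatio(long k11, long k12, long k21, long k22) {
        double rowEntropy = hPrime(k11 + k12, k21 + k22);
        double colEntropy = hPrime(k11 + k21, k12 + k22);
        double matEntropy = hPrime(k11, k12, k21, k22);
        return 2.0 * (rowEntropy + colEntropy - matEntropy);
      }

      public static void main(String[] args) {
        // e.g. co-occurrence counts: both events, A only, B only, neither
        System.out.println(logLikelihoodRatio(10, 15, 5, 1000));
      }
    }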
