"Vimal" wrote:
> Can someone explain or give some urls on 'difference' between
> expectation-maximization and entropy maximization.
>
> To me, both seem to maximize E[log p(x)], where p(x) is the pdf,
> although they originate from different theories.

Watch out: they are not the same thing. Maximizing the log-likelihood
(EM is one particular approach to this) is equivalent to *minimizing*
the cross entropy, or equivalently the relative entropy
(Kullback-Leibler divergence), between the empirical distribution of
the data and the model.

The maximum entropy principle, on the other hand, says that among all
candidate models consistent with your constraints, you should pick the
one with the highest entropy.
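A small numerical sketch of the first point (my own toy example, not
from the thread): for any model p, the average log-likelihood of the
data decomposes as -H(q) - D(q||p), where q is the empirical
distribution. Since H(q) is fixed by the data, maximizing likelihood
over p is the same as minimizing D(q||p):

```python
import math
from collections import Counter

# Toy data over a 3-symbol alphabet.
data = ["a", "a", "b", "a", "c", "b"]
n = len(data)
empirical = {s: c / n for s, c in Counter(data).items()}  # q(x)

def avg_log_lik(p):
    # (1/n) * sum_i log p(x_i)
    return sum(math.log(p[x]) for x in data) / n

def kl(q, p):
    # D(q || p) = sum_x q(x) log(q(x) / p(x))
    return sum(q[x] * math.log(q[x] / p[x]) for x in q)

# Identity: avg log-likelihood = -H(q) - D(q || p) for any model p.
p_model = {"a": 0.4, "b": 0.4, "c": 0.2}
h_q = -sum(q * math.log(q) for q in empirical.values())  # entropy of q
assert abs(avg_log_lik(p_model) - (-h_q - kl(empirical, p_model))) < 1e-12

# D(q || p) is minimized (it reaches 0) at p = q, the maximum-likelihood fit.
assert kl(empirical, empirical) == 0.0

# The max-entropy principle is a different question: with no constraints
# beyond the support {a, b, c}, it would pick the uniform distribution,
# whose entropy log(3) exceeds that of any other distribution on 3 symbols.
assert h_q < math.log(3)
```

The point of the decomposition is that the two principles answer
different questions: likelihood/KL measures fit to observed data, while
maximum entropy selects among models you have no data to distinguish.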

-- 
mag. Aleks Jakulin
http://ai.fri.uni-lj.si/aleks/
Artificial Intelligence Laboratory,
Faculty of Computer and Information Science, University of Ljubljana.

