"Aleks Jakulin" <jakulin@@ieee.org> wrote in message news:<[EMAIL PROTECTED]>...
> "lucas" <[EMAIL PROTECTED]> wrote:
> > a simple question about mutual information:
> >
> > mutual information is defined as I(X;Y) = H(X) - H(X|Y)
> > If X and Y aren't single discrete random variables but are instead
> > sets of discrete random variables (X = {X1, X2, ..., Xn} and
> > Y = {Y1, Y2, ..., Ym}, with each Xi and Yj a discrete random
> > variable), how can I calculate I(X;Y)?
>
> I(ABC;XYZ) is a well-defined concept in information theory. For
> example, if your X is actually a set of variables {X1,X2,X3}, and your
> Y is a set of variables {Y1,Y2}, you could define X' to be the
> Cartesian product of variables in X: X1xX2xX3, and similarly Y' to be
> Y1xY2. Then, you compute I(X';Y'). The trouble with this approach is
> that the resulting Cartesian products may be sparse, and then your
> probability estimates are bad.
Why are the probability estimates bad in that case?
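For concreteness, the quoted Cartesian-product suggestion can be sketched as below: each set of variables is turned into one compound variable by zipping its components into tuples, and then ordinary mutual information is computed between the compound variables. The variable names and data are made up for illustration (Y1 is the XOR of X1 and X2), not taken from the original post.

```python
# Sketch of the Cartesian-product approach: treat a *set* of discrete
# variables as a single compound variable whose symbols are tuples.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate: I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x)p(y)) )."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# X = {X1, X2}, Y = {Y1}, with Y1 = X1 XOR X2 over a balanced sample.
x1 = [0, 0, 1, 1, 0, 0, 1, 1]
x2 = [0, 1, 0, 1, 0, 1, 0, 1]
y1 = [0, 1, 1, 0, 0, 1, 1, 0]

# Compound variable X' = X1 x X2: each sample becomes a tuple (x1, x2).
x_joint = list(zip(x1, x2))

print(mutual_information(x_joint, y1))  # 1.0 bit: (X1,X2) determines Y1
print(mutual_information(x1, y1))       # 0.0 bit: X1 alone says nothing
```

This also makes the sparsity worry visible: every extra variable multiplies the number of cells in the joint table, so with a fixed sample size more and more cells are empty or nearly empty, and plug-in probability (and hence MI) estimates from such sparse tables become unreliable.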
> Here, you may use the interaction
> analysis approach described in http://arxiv.org/abs/cs.AI/0308002, and
> approximate I(X';Y') using lower-order interactions.
How does this approach work, in a few words?
Thanks
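In rough terms, the idea is to replace the one huge, sparsely estimated joint table with a combination of small, well-estimated low-order terms. The sketch below is only an illustration of that general idea using a plain sum of pairwise informations; it is NOT the exact procedure of arXiv:cs.AI/0308002, and all names and data are made up.

```python
# Crude second-order stand-in for I(X'; Y'): sum of all pairwise I(Xi; Yj).
# Each term comes from a dense 2-D table, so it avoids the sparse Cartesian
# product; the price is ignoring higher-order interactions.
from collections import Counter
from itertools import product
from math import log2

def mi(xs, ys):
    """Plug-in mutual information between two discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def pairwise_approx(X_cols, Y_cols):
    """Approximate I(X'; Y') by summing I(Xi; Yj) over all pairs."""
    return sum(mi(x, y) for x, y in product(X_cols, Y_cols))

x1 = [0, 0, 1, 1, 0, 0, 1, 1]
x2 = [0, 1, 0, 1, 0, 1, 0, 1]
y1 = [0, 1, 1, 0, 0, 1, 1, 0]  # XOR of x1 and x2

print(pairwise_approx([x1, x2], [y1]))  # 0.0, though I((X1,X2); Y1) = 1 bit
```

The XOR example shows why a pure pairwise sum can badly underestimate the joint information (synergy) or, with redundant variables, overestimate it; correcting for such effects with higher-order interaction terms is exactly what the interaction-analysis approach in the cited paper is about.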
=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
. http://jse.stat.ncsu.edu/ .
=================================================================