>The other obvious answer is entropy, and its relative, the cross-entropy
>between two random variables.  The introductory probability text by Ross
>(Prentice-Hall 1998) has a nice discussion of entropy and Shannon
>information.  My favorite discussion of entropy and cross-entropy is in
>Bernardo and Smith, *Bayesian Theory* (Wiley 1994).  However, it turns out
>these are value-related.
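For readers who want the quantities above pinned down: a minimal sketch of Shannon entropy and cross-entropy for discrete distributions. This is my illustration, not code from the thread; the function names and the base-2 (bits) convention are my choices.

```python
import math

def entropy(p, base=2.0):
    """Shannon entropy H(p) = -sum_i p_i log p_i (0 log 0 taken as 0)."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def cross_entropy(p, q, base=2.0):
    """Cross-entropy H(p, q) = -sum_i p_i log q_i: the expected code
    length when events occur with probabilities p but are coded as if
    they occurred with probabilities q."""
    return -sum(pi * math.log(qi, base) for pi, qi in zip(p, q) if pi > 0)
```

By Gibbs' inequality, `cross_entropy(p, q) >= entropy(p)` with equality exactly when `q == p`, which is one way to read cross-entropy as a measure of the cost of acting on the wrong distribution.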

I find it hard to envision a notion of value of information that is not
somehow value-related.  I'd be interested to hear from anyone who thinks
he or she can envision one.

Kathy Laskey
