>The other obvious answer is entropy, and its relative, the cross-entropy
>between two distributions.  The introductory probability text by Ross
>(Prentice-Hall 1998) has a nice discussion of entropy and Shannon
>information.  My favorite discussion of entropy and cross-entropy is in
>Bernardo and Smith, *Bayesian Theory* (Wiley 1994).  However, it turns
>out these are value-related.

  I find it hard to envision a notion of value of information that is not
  somehow value-related.  I'd be interested to hear from anyone who thinks
  he/she can envision one.

  Kathy Laskey
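
For concreteness, here is a minimal sketch of the two quantities under
discussion, for discrete distributions represented as probability
vectors; the function names are illustrative, not from either book:

    import math

    def entropy(p):
        """Shannon entropy H(p) = -sum_i p_i log2 p_i, in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def cross_entropy(p, q):
        """Cross-entropy H(p, q) = -sum_i p_i log2 q_i, in bits.

        Equals H(p) plus the Kullback-Leibler divergence KL(p || q),
        so it is minimized over q exactly when q = p.
        """
        return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

    # A fair coin carries one bit of entropy; coding it with a badly
    # mismatched model costs more than one bit per outcome on average.
    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print(entropy(p))           # 1.0
    print(cross_entropy(p, q))  # about 1.74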

I.J. Good used the notion of a quasi-utility to describe some of these
information-theoretic measures; a toy sketch of the idea appears below.
David Madigan put a review of some of that work into:

Madigan, D. and Almond, R.G. (1996) ``Test Selection Strategies for
Belief Networks.''  In D. Fisher and H-J Lenz (eds.) {\it Learning from
Data: AI and Statistics IV\/}, Springer-Verlag, New York, 89--98.

(I think it might also still be available as a tech report from
http://www.stat.washington.edu/)
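
Here is the promised toy sketch of the test-selection idea: score each
candidate test by the expected reduction in the entropy of the hypothesis
(a quasi-utility in Good's sense; this quantity is the mutual information
I(H; T)) and pick the test with the highest score.  The numbers below are
made up for illustration and are not from the paper:

    import math

    def entropy(p):
        """Shannon entropy of a discrete distribution, in bits."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def information_gain(prior, likelihoods):
        """Expected entropy reduction in the hypothesis from one test:
        I(H; T) = H(H) - sum_t P(t) H(H | T = t).

        prior:        prior[h] = P(H = h)
        likelihoods:  likelihoods[h][t] = P(T = t | H = h)
        """
        n_outcomes = len(likelihoods[0])
        expected_posterior_entropy = 0.0
        for t in range(n_outcomes):
            # Joint P(h, t), normalized to the posterior P(h | t).
            joint = [ph * lk[t] for ph, lk in zip(prior, likelihoods)]
            pt = sum(joint)
            if pt > 0:
                posterior = [j / pt for j in joint]
                expected_posterior_entropy += pt * entropy(posterior)
        return entropy(prior) - expected_posterior_entropy

    # Myopic (one-step) selection: pick the test with the largest gain.
    prior = [0.5, 0.5]
    tests = {
        "informative":   [[0.9, 0.1], [0.2, 0.8]],   # rows: P(t | h)
        "uninformative": [[0.5, 0.5], [0.5, 0.5]],
    }
    best = max(tests, key=lambda name: information_gain(prior, tests[name]))
    print(best)  # informative
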
        --Russell
