> The other obvious answer is entropy, and its relative, the cross-entropy
> between two random variables. The introductory probability text by Ross
> (Prentice-Hall 1998) has a nice discussion of entropy and Shannon
> information. My favorite discussion of entropy and cross-entropy is in
> Bernardo and Smith, *Bayesian Theory* (Wiley 1994). However, it turns out
> these are value-related.

I find it hard to envision a notion of value of information that is not somehow value-related. I'd be interested in hearing from anyone who thinks he/she can.

Kathy Laskey
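The entropy-based notion mentioned in the quote above can be made concrete: the expected reduction in entropy from an observation (the mutual information between hypothesis and evidence) is a candidate "value of information" that references no utility function. A minimal sketch, with illustrative numbers I've chosen for the example (a binary hypothesis with prior 0.5/0.5 and likelihoods 0.8/0.3), not anything from the thread:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability outcomes contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) in bits; assumes q > 0 wherever p > 0."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative prior over a binary hypothesis (assumed numbers)
prior = [0.5, 0.5]

# Likelihoods P(e | h) of an observation e under each hypothesis (assumed)
like_e = [0.8, 0.3]

# Posteriors via Bayes' rule, for e observed and e not observed
p_e = sum(ph * le for ph, le in zip(prior, like_e))
post_e = [ph * le / p_e for ph, le in zip(prior, like_e)]
post_not_e = [ph * (1 - le) / (1 - p_e) for ph, le in zip(prior, like_e)]

# Expected posterior entropy, and the expected entropy reduction:
# this is the mutual information I(H; E) -- a value-of-information
# measure defined without any utility function.
expected_posterior_H = p_e * entropy(post_e) + (1 - p_e) * entropy(post_not_e)
info_value = entropy(prior) - expected_posterior_H
print(f"Expected entropy reduction: {info_value:.4f} bits")
```

Whether this counts as "value-free" is exactly the point under debate: choosing entropy as the quantity to reduce is itself a value judgment, which is the sense in which Laskey calls these measures value-related.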
- "Value of information without utilities?" Marek J. Druzdzel
- Re: "Value of information without utilities?" Finn V. Jensen
- Re: "Value of information without utilities?" Gordon Hazen
- Re: "Value of information without utilities?" Kathryn Blackmond Laskey
- Re: "Value of information without utilit... Russell Almond
- Re: "Value of information without utilities?" Judea Pearl
- Re: "Value of information without utilities?" Bob Welch
- Re: "Value of information without utilities?" Dr. Lian Wen Zhang
- Re: "Value of information without utilit... Peter Szolovits
- Re: "Value of information without utilities?" Bob Welch
- Re: "Value of information without utilities?" Bob Welch
- Re: "Value of information without utilit... Rina Dechter
- Re: "Value of information without utilit... Marco Valtorta
