Dear AI colleagues,

Does anyone know which of Eric Horvitz's papers discusses
"non-monotonic entropy change", as described by Robert Dodier below?

Thanks for any information.

Yongmian Zhang
 


On Mon, 29 Sep 2003, Robert Dodier wrote:
> 
> Just a footnote on this topic -- it's not the case that additional
> evidence (nodes with assigned values) always decreases the entropy
> of the joint distribution modeled by the belief network. For example,
> suppose the prior for a binary node is heavily weighted toward 0 or 1,
> yet the likelihood function is the other way around; then the posterior
> will be more or less balanced across 0 and 1, and the posterior will
> have greater entropy than the prior.
> 
> There was a discussion of this non-monotonic entropy change on this
> list a few years ago. Eric Horvitz pointed out that while entropy
> change is not necessarily monotonic, the expected gain of decisions made
> w.r.t. the joint distribution is non-decreasing (if I've remembered
> this correctly). Eric referred to a paper he wrote in which that was
> proved. Sorry, I don't have a reference.
> 
> For what it's worth,
> Robert Dodier
> 
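The flip Robert describes above can be checked numerically. Below is a minimal sketch (the specific probabilities 0.95/0.05 are my own illustrative choices, not from the thread): a binary node with a prior heavily weighted toward 1, and evidence whose likelihood leans the other way, yields a posterior near 0.5 whose entropy exceeds the prior's.

```python
from math import log2

def entropy(p):
    """Binary entropy in bits of a Bernoulli(p) distribution."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Prior heavily weighted toward X = 1 (illustrative numbers).
prior = 0.95                # P(X = 1)

# Likelihood of the observed evidence E leans the other way.
lik1 = 0.05                 # P(E | X = 1)
lik0 = 0.95                 # P(E | X = 0)

# Posterior via Bayes' rule.
post = prior * lik1 / (prior * lik1 + (1 - prior) * lik0)

print(entropy(prior))       # about 0.286 bits
print(entropy(post))        # 1.0 bit: the evidence *increased* the entropy
```

Here the posterior works out to exactly 0.5, so its entropy is the maximum possible for a binary variable, even though observing evidence intuitively "should" reduce uncertainty.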
