Julian Russell wrote:

> Does it make sense to calculate the entropy of a node in a Bayes net by
> using Shannon's communication entropy equation? Can a node be considered a
> 'communication channel' for this purpose? Following on from this, would it
> be reasonable to measure the change in entropy as more data is added to the
> decision model, thereby decreasing the uncertainty/risk by defining the node
> with more certainty, as measured by the progressively decreasing entropy?

Just a footnote on this topic -- it's not the case that additional
evidence (nodes with assigned values) always decreases the entropy
of the joint distribution modeled by the belief network. For example,
suppose the prior for a binary node is heavily weighted toward one
state, while the likelihood of the evidence is weighted toward the
other; then the posterior will be roughly balanced between the two
states, and so will have greater entropy than the prior.
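
For concreteness, here is a minimal numerical sketch of that example
in Python (the specific numbers are mine, chosen to illustrate the
point, not taken from the original discussion):

  import numpy as np

  def entropy(p):
      # Shannon entropy in bits of a discrete distribution p,
      # with 0 * log(0) taken as 0.
      p = np.asarray(p, dtype=float)
      p = p[p > 0]
      return -np.sum(p * np.log2(p))

  prior = np.array([0.95, 0.05])       # heavily weighted toward state 0
  likelihood = np.array([0.05, 0.95])  # P(evidence | state), the other way
  posterior = prior * likelihood
  posterior /= posterior.sum()         # Bayes' rule, normalized

  print(entropy(prior))      # about 0.29 bits
  print(entropy(posterior))  # 1.0 bits: the evidence *increased* entropy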

There was a discussion of this non-monotonic entropy change on this
list a few years ago. Eric Horvitz pointed out that while the entropy
change is not necessarily monotonic, the expected value of decisions
made with respect to the joint distribution is non-decreasing (if I've
remembered this correctly). Eric referred to a paper he wrote in which
that was proved. Sorry, I don't have a reference.
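
A related standard fact (my gloss, not part of that earlier discussion)
may help square the two observations: although any particular piece of
evidence can increase entropy, the entropy averaged over all possible
evidence values never does, since

  H(X | Y) = H(X) - I(X; Y) <= H(X)

and the mutual information I(X; Y) is nonnegative. So entropy decreases
in expectation, even though it is not monotone case by case.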

For what it's worth,
Robert Dodier

