Dear Colleagues,

I am in need of a "divergence metric" that would render
Bayesian updating a continuous operator; i.e.,

if D(p,q) is the divergence between distributions p and q,
and p^e is the result of updating p with evidence e:

  p^e[i] = p[i] P(e|i) / \sum_j p[j] P(e|j)

where P(.|.) doesn't depend on p. Then, I would like to have

  (1) D(p^e,q^e) <= K*D(p,q)

where K is a constant (preferably less than one).
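
For concreteness, here is a minimal sketch of that update operator
in Python (the name bayes_update and the use of numpy are my own,
purely for illustration):

  import numpy as np

  def bayes_update(p, lik):
      # posterior p^e[i] = p[i] * P(e|i) / sum_j p[j] * P(e|j)
      # p   : prior over states (1-d array summing to one)
      # lik : likelihoods P(e|i), one entry per state, independent of p
      post = p * lik
      return post / post.sum()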
In addition, I need D(.,.) to be a metric on the space
of distributions; i.e.,

  (2) D(p,q) = 0 iff p = q,
  (3) D(p,q) = D(q,p),
  (4) D(p,q) <= D(p,r) + D(r,q)


It is easy to see (with a counterexample) that the symmetric
Kullback-Leibler divergence doesn't satisfy (1).
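For instance (my own numbers, just one way to see it): take
p = (0.98, 0.01, 0.01), q = (0.98, 0.015, 0.005) and evidence with
P(e|.) = (0, 1, 1). The priors are close, but the posteriors are
p^e = (0, 0.5, 0.5) and q^e = (0, 0.75, 0.25), so the symmetric KL
grows by roughly a factor of 50; shrinking the prior mass on states
2 and 3 makes that ratio arbitrarily large, so no constant K can
work. A small numerical check:

  import numpy as np

  def bayes_update(p, lik):          # update operator from the sketch above
      post = p * lik
      return post / post.sum()

  def kl(p, q):                      # with the convention 0 * log(0/0) = 0
      m = p > 0
      return float(np.sum(p[m] * np.log(p[m] / q[m])))

  def sym_kl(p, q):                  # symmetric Kullback-Leibler divergence
      return kl(p, q) + kl(q, p)

  p   = np.array([0.98, 0.01, 0.01])
  q   = np.array([0.98, 0.015, 0.005])
  lik = np.array([0.0, 1.0, 1.0])    # P(e|i): evidence rules out state 1

  before = sym_kl(p, q)
  after  = sym_kl(bayes_update(p, lik), bayes_update(q, lik))
  print(before, after, after / before)   # ~0.0055, ~0.27, ratio ~50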

I wonder if such D(.,.) exists.


Thanks in advance


Blai Bonet
