While discretization is doable for known distributions, it is problematic
for variables whose posterior distribution is not known and may depend on
the evidence. One alternative is not to discretize continuous variables, but
to approximate continuous densities with mixtures of truncated exponentials
(MTE). One can get good approximations with few components, and MTE
potentials can be propagated exactly using the Shenoy-Shafer architecture.
See the working paper below for the use of MTE potentials in modeling
conditional Gaussian distributions.
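The practical advantage of the MTE form is that each piece is a constant plus exponential terms, so marginals integrate in closed form and no numerical quadrature is needed during propagation. Below is a minimal sketch of that idea in Python; the two-piece potential and its coefficients are made up for illustration and are not the Cobb-Shenoy approximation from the paper.

```python
import math

# An MTE potential is piecewise: on [lo, hi], f(x) = a + sum_j b_j * exp(c_j * x).
# Key property: each piece integrates in closed form, which is what allows
# exact propagation of MTE potentials.

def piece_integral(lo, hi, a, terms):
    """Exact integral of a + sum b*exp(c*x) over [lo, hi]."""
    total = a * (hi - lo)
    for b, c in terms:
        total += (b / c) * (math.exp(c * hi) - math.exp(c * lo))
    return total

# Toy 2-piece MTE, symmetric around 0 (hypothetical coefficients).
pieces = [
    # (lo, hi, a, [(b, c), ...])
    (-3.0, 0.0, 0.1, [(0.5, 1.0)]),
    (0.0, 3.0, 0.1, [(0.5, -1.0)]),
]

mass = sum(piece_integral(lo, hi, a, terms) for lo, hi, a, terms in pieces)

# Normalizing to a proper density is again exact: just rescale coefficients.
norm_pieces = [(lo, hi, a / mass, [(b / mass, c) for b, c in terms])
               for lo, hi, a, terms in pieces]

total = sum(piece_integral(lo, hi, a, terms)
            for lo, hi, a, terms in norm_pieces)
print(round(total, 10))  # 1.0
```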

Cobb, B. R. and P. P. Shenoy, "Inference in Hybrid Bayesian Networks with
Mixtures of Truncated Exponentials," Working Paper No. 294, June 2003,
School of Business, University of Kansas. Can download pdf version from:
<http://lark.cc.ku.edu/~pshenoy/Papers/WP294.pdf>

Prakash Shenoy
--
> From: Mi Hyun Park <[EMAIL PROTECTED]>
> Date: Thu, 04 Sep 2003 11:27:23 -0700
> To: [EMAIL PROTECTED]
> Subject: [UAI] Discretization (or quantization) methods of data
> 
> Dear UAI Colleagues,
> 
> I am working with Bayesian networks in image processing. Would you give me
> some help on data discretization (or quantization) methods? The methods
> I've used were equal interval, equal frequency, and standard deviation for
> discretization. It would be very helpful if somebody gives me references
> on this matter and other proposed methodologies (or free software for
> test). If there is an outstanding or recommended method, please let me
> know.
> 
> Thank you,
> 
> Mi-Hyun
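
For reference, the first two schemes named in the question (equal interval and equal frequency) can be sketched in a few lines; this uses only the standard library, and the bin-edge conventions (e.g. which quantile index defines an edge) are one common choice, not tied to any particular package.

```python
def equal_width_edges(data, k):
    """k bins of equal width spanning [min, max] (equal-interval scheme)."""
    lo, hi = min(data), max(data)
    w = (hi - lo) / k
    return [lo + i * w for i in range(k + 1)]

def equal_frequency_edges(data, k):
    """Edges at the i/k sample quantiles, so bins hold roughly equal counts."""
    s = sorted(data)
    n = len(s)
    edges = [s[0]]
    for i in range(1, k):
        edges.append(s[(i * n) // k])
    edges.append(s[-1])
    return edges

data = [1.0, 2.0, 3.0, 4.0, 10.0, 50.0, 90.0, 95.0]
print(equal_width_edges(data, 2))      # [1.0, 48.0, 95.0]
print(equal_frequency_edges(data, 2))  # [1.0, 10.0, 95.0]
```

On skewed data like this, equal-width bins leave most points in one bin, while equal-frequency bins balance the counts, which is often the deciding factor in practice.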

