No need to propagate - just find the row of the cp-table of the next node
corresponding to the values of the parents already selected.
This makes a couple of assumptions: (1) you still have the original
directed graph representation and it is acyclic, (2) your
instantiation ordering is compatible with the partial ordering defined
by the ADG.
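
For concreteness, here is a minimal Python sketch of that shortcut
(forward/ancestral sampling) on a hypothetical two-node net A -> B. The
CPT numbers and names are made up for illustration, and the ordering
must respect the DAG, as noted above:

```python
import random

# Hypothetical net A -> B. Each CPT row is keyed by a tuple of the
# parents' values; each row maps node value -> probability.
cpts = {
    "A": {"parents": [], "table": {(): {0: 0.3, 1: 0.7}}},
    "B": {"parents": ["A"], "table": {(0,): {0: 0.9, 1: 0.1},
                                      (1,): {0: 0.2, 1: 0.8}}},
}
order = ["A", "B"]  # must be compatible with the DAG's partial order

def sample_case(cpts, order, rng=random):
    case = {}
    for node in order:
        # Find the CPT row matching the parents' already-sampled values.
        key = tuple(case[p] for p in cpts[node]["parents"])
        row = cpts[node]["table"][key]
        # Sample a value from that row by inverse CDF.
        r, acc = rng.random(), 0.0
        for value, prob in row.items():
            acc += prob
            if r <= acc:
                case[node] = value
                break
        else:
            case[node] = value  # guard against float round-off
    return case
```

Calling sample_case repeatedly yields a database of cases; no
propagation engine is needed, just CPT lookups.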
The algorithm I proposed is less efficient, but should be compatible
with most software (even if you've already transformed to a junction
tree representation).
There is also a slightly more efficient version in the junction tree.
You propagate one way (to the root node). You then instantiate the
root node, propagate that out, and instantiate the uninstantiated
variables at each node as you visit it.
--Russell
>
> I'm working on an algorithm for learning the structure of a Bayesian
> network from data using a Bayesian approach. In order to evaluate my
> algorithm, I'm looking for a free and simple program that creates a
> database of cases from a given structure. Once the database is
> generated, I will run my algorithm on it and compare the initial
> structure with the structure detected.
>
>
>The algorithm is so simple that any Bayes net propagation engine could in
>principle do it. (If it comes as a shared library, it is relatively
>simple to write a wrapper program to do this. I did it recently using
>the Ergo DLL.) First, select a node and get its marginal
>distribution, randomly select a value according to that distribution.
>Next, set that node to the randomly selected value and propagate. Now
>choose another node, and find its marginal distribution (conditioned
>on the value of the first node). Randomly choose its value according
>to that conditional distribution. When you have instantiated all of
>the nodes, you will have sampled a value from this Bayes net.
------- End of Forwarded Message
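
The propagation-based algorithm quoted above can also be sketched in
Python. Since no particular engine is assumed here, brute-force
enumeration over a hypothetical two-node net A -> B stands in for the
engine's propagation step (the numbers are made up); a real engine
would just replace the marginal computation:

```python
import itertools
import random

# Toy net A -> B (hypothetical CPTs, for illustration only).
P_A = {0: 0.3, 1: 0.7}
P_B_given_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

def joint(a, b):
    # Chain rule: P(A, B) = P(A) * P(B | A).
    return P_A[a] * P_B_given_A[a][b]

def marginal(node, evidence):
    # Stand-in for engine propagation: sum the joint over every
    # assignment consistent with the evidence, then normalize.
    probs = {}
    for a, b in itertools.product((0, 1), repeat=2):
        assign = {"A": a, "B": b}
        if all(assign[k] == v for k, v in evidence.items()):
            probs[assign[node]] = probs.get(assign[node], 0.0) + joint(a, b)
    z = sum(probs.values())
    return {v: p / z for v, p in probs.items()}

def sample_case(rng=random):
    # Pick a node, sample from its (conditional) marginal, set it as
    # evidence, "propagate", and repeat until all nodes are set.
    evidence = {}
    for node in ("B", "A"):  # deliberately child-first: any order works
        dist = marginal(node, evidence)
        r, acc = rng.random(), 0.0
        for value, prob in dist.items():
            acc += prob
            if r <= acc:
                evidence[node] = value
                break
        else:
            evidence[node] = value  # guard against float round-off
    return evidence
```

Note the node order is arbitrary, which is exactly why this version
works with any propagation engine, unlike the CPT-lookup shortcut.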