On 15.01.20 at 11:22, dawe wrote:
> I have a question related to this.
> The documentation example suggests padding the hierarchy to 10 levels:
> 
> bs = state.get_bs()                     # Get hierarchical partition.
> bs += [np.zeros(1)] * (10 - len(bs))    # Augment it to L = 10 with
>                                         # single-group levels.
> 
> state = state.copy(bs=bs, sampling=True)
> 
> Is there some golden rule (which I obviously don’t know) for choosing this
> dimension? Is 10 always a good choice? More importantly: why do I need to
> modify the length before mcmc_equilibrate()?

The number of layers specified there is only an *upper bound*, which should be
large enough to accommodate your posterior distribution. The algorithm will
decide automatically how many layers are in fact used.

Since the number of nodes tends to decay exponentially in the upper
levels, a value of 10 is often quite enough. For very large networks
this can be increased to 15 or 20, but we rarely need more than that.
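
For example, here is a minimal sketch along the lines of the documentation
example (the "lesmis" graph and the wait/niter values are only illustrative
choices): pad the hierarchy to an upper bound of L = 10, equilibrate, and then
count how many levels actually contain more than one group.

import numpy as np
import graph_tool.all as gt

g = gt.collection.data["lesmis"]             # example graph (illustrative)

state = gt.minimize_nested_blockmodel_dl(g)  # initial nested SBM fit

bs = state.get_bs()                          # hierarchical partition
bs += [np.zeros(1)] * (10 - len(bs))         # pad to the upper bound L = 10
                                             # with single-group levels
state = state.copy(bs=bs, sampling=True)

gt.mcmc_equilibrate(state, wait=1000, mcmc_args=dict(niter=10))

# The padded levels are only an upper bound: after equilibration we can
# count how many levels are actually occupied by more than one group.
effective_L = sum(1 for s in state.get_levels() if s.get_nonempty_B() > 1)
print("effective number of levels:", effective_L)

The number of non-trivial levels found this way will typically be much smaller
than 10, which is why the upper bound is usually safe to leave generous.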


-- 
Tiago de Paula Peixoto <[email protected]>

