On 09.11.18 at 16:30, Tzu-Chi Yen wrote:
> Now I understand that `edges_dl` specifically encodes the flat prior. I have
> 2 following questions: 
> 
> - How could I access the terms in Eq. (41) of the PRE paper, i.e. each
> term being the level-wise entropy of edge counts, as Eq. (42) describes?

These are given by the different hierarchy levels, i.e. `level_entropy(1)`,
`level_entropy(2)`, etc.

> For the "lesmis" dataset, the bottom-most layer has the entropy:
> >>> nested_state.level_entropy(0)
> Out[•]: 630.133156768878
> 
> This is exactly the sum of these three entropic terms: "adjacency"
> (332.24632), "degree_dl" (170.10951), and "partition_dl" (127.77732). I
> could not find an explanation for the missing entropy of the edge counts.

The missing edge-count term is accounted for by the upper layers, as
explained above.

> - I found that `nested_state.levels[0].entropy(deg_entropy=True) -
> nested_state.levels[0].entropy(deg_entropy=False) < 0`. This command is
> expected to print the negative logarithm of Eq. (28) of the paper, which is
> positive. I am not sure what went wrong.

No, `deg_entropy` controls the degree part of the likelihood, not the prior.
The parameter you want is `degree_dl`.

Best,
Tiago

-- 
Tiago de Paula Peixoto <[email protected]>


_______________________________________________
graph-tool mailing list
[email protected]
https://lists.skewed.de/mailman/listinfo/graph-tool
