On 26.03.2018 12:49, Katharina Baum wrote:
> I recently started using graph-tool: nice software, excellent
> documentation, thank you!
> 
> However, I stumbled on some (for me) unexpected behavior when using the
> get_edges_prob() function with a BlockState of a weighted network
> (graph-tool version 2.26, Python 2.7 as well as Python 3.6). Calling
> get_edges_prob() on a state alters its entropy, and subsequent calls of
> get_edges_prob() deliver different results.
> 
> "Luckily", I could reproduce the observed behavior with a dataset from the
> graph-tool collection (with arguably small alterations; the differences
> introduced are a lot bigger in my own networks).
> 
> import graph_tool as gt
> import graph_tool.collection as gtc
> import graph_tool.inference as gti
> 
> g = gtc.data["celegansneural"]
> state = gti.minimize_blockmodel_dl(g, state_args=dict(recs=[g.ep.value],
>                                                       rec_types=["real-normal"]))
> 
> original_entropy = state.entropy()
> edge_prob = []
> for i in range(10000):
>     edge_prob.append(state.get_edges_prob(missing=[], spurious=[(0, 2)]))
> 
> original_entropy
> state.entropy()  # entropy differs from the original!
> edge_prob[0]     # first call of get_edges_prob() delivers different results than the last
> edge_prob[-1]
> 
> For me, this is really unexpected. What is happening there, and/or how can
> this be fixed?
> Further small experiments showed that this also happens with
> NestedBlockState (of course), but it seems not to happen for models lacking
> edge covariates...

This does indeed seem like a bug. I suspect it has to do with the "real-normal"
model. Can you open an issue on the website with the above example, so I can
keep track of this and fix it?

Could you also test this with "real-exponential" instead of "real-normal",
and also with the current git version?

Best,
Tiago

-- 
Tiago de Paula Peixoto <[email protected]>
_______________________________________________
graph-tool mailing list
[email protected]
https://lists.skewed.de/mailman/listinfo/graph-tool