Also, re the original question: there is no built-in constraint on the
size of a graph apart from the fact that vertex and edge IDs are
typically stored as long integers, so you won't be able to have more
than about 2 billion vertices or edges.  But if you have that many
vertices or edges, you probably have bigger problems anyway ;)
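To get a feel for whether a graph of this size is practical to load, a
minimal sketch along these lines might help (this assumes python-igraph;
the edge-list file name "edges.txt" is just a placeholder):

    import igraph

    # Read a (hypothetical) whitespace-separated edge list with integer
    # vertex IDs; ~21 M vertices / ~60 M edges is well below the ID limit.
    g = igraph.Graph.Read_Edgelist("edges.txt", directed=False)

    print(g.vcount(), g.ecount())  # sanity-check the vertex/edge counts
    print(g.summary())             # one-line structural summary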
T.


On Mon, Apr 25, 2016 at 11:11 AM, Tamas Nepusz <[email protected]> wrote:
> Hi,
>
> Storing this graph should not be a problem for igraph. As for further
> analysis - it depends on what you want to do with it. Some algorithms
> might work while others would be infeasible to try; for instance, edge
> betweenness community detection probably wouldn't work, but the
> Louvain method (a.k.a. multilevel community detection) probably would
> (given enough time).
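> For example, the multilevel (Louvain) method is exposed directly in
> python-igraph; a minimal, self-contained sketch (using the small
> built-in Zachary karate club graph as a stand-in for your real data)
> would look like this:
>
>     import igraph
>
>     # Small stand-in graph; replace with the real 21 M-node graph.
>     g = igraph.Graph.Famous("Zachary")
>
>     clusters = g.community_multilevel()  # Louvain / multilevel method
>     print(len(clusters))                 # number of communities found
>     print(clusters.modularity)           # modularity of the partition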
>
> All the best,
> T.
>
>
> On Sun, Apr 24, 2016 at 4:41 AM, Pablo Moriano <[email protected]> wrote:
>> Hi,
>>
>> I was wondering whether there is any constraint on the size of a graph in
>> python-igraph. I need to manipulate a graph of approximately 21 M nodes and
>> 60 M edges. I can assign up to 64 GB of RAM to this task. Thank you.

_______________________________________________
igraph-help mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/igraph-help
