Hi,

I am trying to create a very large graph using the random_graph function. My
computer has 128 GB of physical memory and a 16 GB swap partition. When I try
to create a graph with 50 million vertices, and 80 for both the number of
incoming and outgoing edges, it consumes all available memory and then crashes.
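For scale, here is a rough back-of-envelope estimate of the memory needed. The ~40 bytes per edge is only an assumed figure for adjacency-list overhead, not a measured graph-tool number, but even a conservative guess puts the graph well above 128 GB:

```python
# Rough memory estimate for the graph described above.
# bytes_per_edge is an ASSUMED per-edge adjacency-list overhead,
# not a measured graph-tool value.
vertices = 50_000_000
out_degree = 80
edges = vertices * out_degree            # 4 billion directed edges

bytes_per_edge = 40                      # assumption
total_gib = edges * bytes_per_edge / 2**30
print(f"{edges:,} edges -> about {total_gib:.0f} GiB")
```

With these assumptions the edge storage alone is on the order of 150 GiB, which would explain exhausting 128 GB of RAM plus swap.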

Do you have any idea how I can overcome this memory limit? Is there any way
to make the random_graph function more memory-efficient?

Thanks,
Arash



_______________________________________________
graph-tool mailing list
[email protected]
http://lists.skewed.de/mailman/listinfo/graph-tool
