Hello,

I haven't had any problems running the algorithm on an Erdos-Renyi random graph with 100K vertices and 500K edges. I also tried a geometric random graph of roughly the same size (100K vertices, 600K edges) and that worked as well. I only have 4 GB of RAM and the memory usage never climbed above a couple of hundred megabytes, so what you are trying to do should definitely be possible.
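For reference, a minimal sketch of the kind of test I mean looks roughly like this (assuming python-igraph; the graph sizes here are just illustrative, adjust them to match your dataset):

from igraph import Graph

# Build an Erdos-Renyi random graph with a fixed number of edges
# (100K vertices, 500K edges, comparable in size to your data).
g = Graph.Erdos_Renyi(n=100000, m=500000)

# Fast greedy community detection returns a dendrogram; cutting it
# with as_clustering() gives the flat community structure.
dendrogram = g.community_fastgreedy()
clustering = dendrogram.as_clustering()
print(len(clustering), "communities")

The last two calls are the same ones from your snippet below; only the graph construction differs.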
One possibility I can think of right now is that you are running a 32-bit OS, or a 32-bit build of Python on a 64-bit OS, so Python cannot use the entire 6 GB of memory you have due to its limited address space. But even that is unlikely to be the cause of the problem, because in my case the memory usage stayed well below that limit.

Cheers,
T.

On 10/17, Agha Hashmi wrote:
> Hey everyone,
> I am working with the Python version of igraph to analyse different
> community detection algorithms on larger datasets (100 thousand nodes,
> 500 thousand edges), but it always fails with a memory error saying
> "memory failure or the process is terminated". I am working with a
> Core i7 processor and 6 GB of RAM. Is it a memory problem or something
> else? I have not worked on larger graphs before, so it would be helpful
> if anyone could recommend the fastest way of analysing social graphs;
> can it be done efficiently on a laptop, or do I need to run it on a
> server? I am pasting my code for the graph below.
>
> ############FastGreedy Algorithm###########
> fastgreedy = g.community_fastgreedy()
> fast = fastgreedy.as_clustering()
>
> cheers,
> sam

--
T.

_______________________________________________
igraph-help mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/igraph-help
