I am also having problems with triangle count - it seems this algorithm is very memory-hungry (I could not process even small graphs of ~5 million vertices and 70 million edges with less than 32 GB of RAM on EACH machine).
If I have a graph with a billion edges, how much RAM will I need then?
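For scale, here is a very rough back-of-envelope sketch (my own assumption, not a GraphX figure): if triangle counting materializes each vertex's neighbor set as 8-byte vertex IDs, the raw neighbor data alone for an undirected graph is about `numEdges * 2 * 8` bytes, before JVM object headers, boxing, and hash-set overhead, which in practice multiply this several times over:

```scala
object NeighborSetEstimate {
  // Rough lower bound only: each undirected edge appears in two neighbor
  // sets, and each entry is assumed to be an 8-byte vertex ID. Actual JVM
  // usage is far higher due to object headers, boxing, and load factors.
  def rawNeighborBytes(numEdges: Long): Long = numEdges * 2 * 8
}
```

Under this assumption, 70 million edges is already ~1.1 GB of raw neighbor data per full copy, and a billion edges is ~16 GB raw, so the real per-executor requirement would be several times larger still.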
Hi there,
I got an error when running a simple GraphX program.
My setup is: Spark 1.4.0, Hadoop YARN 2.5, Scala 2.10, with four virtual
machines.
I constructed a small graph (6 nodes, 4 edges) and ran:
println("triangleCount: %s".format(
  hdfs_graph.triangleCount().vertices.count()))
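For anyone unfamiliar with what `triangleCount` computes, here is a minimal local sketch of the general edge-iterator triangle-counting idea (NOT GraphX's actual implementation, and it assumes no duplicate edges): each vertex keeps its full neighbor set in memory, which also hints at why the distributed version is memory-hungry.

```scala
object TriangleSketch {
  // Per-vertex triangle counts for a small undirected edge list.
  def triangleCount(edges: Seq[(Int, Int)]): Map[Int, Int] = {
    // Build adjacency sets, dropping self-loops.
    val adj: Map[Int, Set[Int]] =
      edges
        .flatMap { case (a, b) => Seq(a -> b, b -> a) }
        .groupBy(_._1)
        .map { case (v, ns) => v -> (ns.map(_._2).toSet - v) }

    val counts = scala.collection.mutable.Map[Int, Int]().withDefaultValue(0)
    for ((u, v) <- edges if u != v) {
      // Every common neighbor of u and v closes a triangle over edge (u, v).
      val common = adj(u).intersect(adj(v)).size
      counts(u) += common
      counts(v) += common
    }
    // Each triangle at a vertex is discovered via both of its incident
    // edges at that vertex, so halve the per-vertex totals.
    counts.map { case (v, c) => v -> c / 2 }.toMap
  }
}
```

On a single triangle `Seq((1, 2), (2, 3), (1, 3))` each vertex gets a count of 1. Note that GraphX's `triangleCount().vertices.count()` as written above counts the *vertices* in the result, not the triangles; to inspect per-vertex counts you would collect the vertices RDD instead.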