Hi,
I'm running a Giraph PageRank job on 8GB of input text data
across 10 nodes (each with 4 cores, 4 disks, and 12GB of physical memory),
i.e., about 800MB of input data per machine. However, the Giraph job fails
with high GC costs and an OutOfMemory exception.
Do I need to set anything special in the Hadoop configuration, for example
the maximum heap size for the map-task JVM?
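For context, here is what I have tried so far: raising the child JVM heap in mapred-site.xml via the standard mapred.child.java.opts property (the -Xmx value of 2048m below is just an example I experimented with, not a recommendation):

```xml
<!-- mapred-site.xml: sets the heap for each map/reduce task JVM.
     The default is typically -Xmx200m, which is far too small for
     Giraph, since each worker holds its graph partition in memory. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx2048m</value>
</property>
```

Is this the right knob, or are there Giraph-specific settings I should be tuning as well?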
Thanks!

Best regards,
Yingyi
