How many partitions does the experiment use for the
uk-2007-05 dataset? I tried 16 and 192 partitions, and both failed.
Original Message
From: Ted yuyuzhih...@gmail.com
To: txw...@outlook.com
Cc: useru...@spark.apache.org
Sent: Friday, January 16, 2015, 02:23
主题:Re: Is spark suitable for large scale pagerank, such as 200 million
Hi,
I am running PageRank on a large dataset, which includes 200 million nodes and 2
billion edges.
Is Spark suitable for large-scale PageRank? How many cores and how much memory do I
need, and how long will it take?
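As a very rough sanity check on the memory question, one can do a back-of-envelope estimate from the graph size given above (200 million nodes, 2 billion edges). The per-element byte sizes and the overhead factor below are assumptions for illustration, not measured GraphX figures:

```python
# Back-of-envelope memory estimate for PageRank on a graph with
# 200 million nodes and 2 billion edges (figures from the question).
# All constants here are assumptions, not measured Spark/GraphX numbers.

def estimate_memory_gb(nodes, edges,
                       bytes_per_edge=16,    # two 64-bit vertex ids (assumed)
                       bytes_per_node=16,    # 64-bit id + double rank (assumed)
                       overhead_factor=5):   # JVM object/shuffle overhead (assumed)
    raw_bytes = nodes * bytes_per_node + edges * bytes_per_edge
    return raw_bytes * overhead_factor / 2**30

total = estimate_memory_gb(200_000_000, 2_000_000_000)
print(f"~{total:.0f} GB of cluster memory as a rough lower bound")
```

Under these assumptions the estimate lands in the 150-200 GB range, which suggests a cluster with aggregate memory well beyond a single machine; actual requirements depend heavily on serialization, partitioning, and caching settings.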
Thanks
Xuewei Tang