Still has some overhead; the wiki dataset I benched was just 1 GB. I wonder where our hotspots are. Did you get your YourKit license to profile?
2012/8/30 Edward J. Yoon <[email protected]>

> My settings and envs were like this:
>
> * child.java.opts = -Xmx1024m
> * tasks.num per node = 5
> * total tasks = 85
> * network env = infiniband 40G
> * physical nodes = 17
> * hadoop-cdh-0.20
> * zookeeper-cdh
>
> Finished in 127 secs.
>
> On Thu, Aug 30, 2012 at 10:27 PM, Thomas Jungblut
> <[email protected]> wrote:
> > Yeah, lots of ram ;))
> >
> > 2012/8/30 Edward J. Yoon <[email protected]>
> >
> >> > It runs locally in under 120s on my computer
> >>
> >> Haha, single machine?
> >>
> >> On Thu, Aug 30, 2012 at 10:25 PM, Thomas Jungblut
> >> <[email protected]> wrote:
> >> > It runs locally in under 120s on my computer so I assume that this is
> >> > correct.
> >> >
> >> > 2012/8/30 Edward J. Yoon <[email protected]>
> >> >
> >> >> Just two minutes were needed to process wikipedia dataset. Is this
> >> >> normal?
> >> >>
> >> >> On Tue, Jul 10, 2012 at 8:17 PM, Thomas Jungblut
> >> >> <[email protected]> wrote:
> >> >> > http://snap.stanford.edu/data/
> >> >> >
> >> >> > 2012/7/10 Edward J. Yoon <[email protected]>
> >> >> >
> >> >> >> Just wondering, do you know more large and downloadable data set?
> >> >> >>
> >> >> >> On Tue, Jul 10, 2012 at 8:01 PM, Edward J. Yoon
> >> >> >> <[email protected]> wrote:
> >> >> >> > Oh... thanks. now I can guess what it is.
> >> >> >> >
> >> >> >> > On Tue, Jul 10, 2012 at 7:57 PM, Thomas Jungblut
> >> >> >> > <[email protected]> wrote:
> >> >> >> >> Yes, the example graph is not complete thus resulting in NPEs.
> >> >> >> >>
> >> >> >> >> 2012/7/10 Edward J. Yoon <[email protected]>
> >> >> >> >>
> >> >> >> >>> Nope, should I set to true?
> >> >> >> >>>
> >> >> >> >>> On Tue, Jul 10, 2012 at 7:52 PM, Thomas Jungblut
> >> >> >> >>> <[email protected]> wrote:
> >> >> >> >>> > Set "hama.graph.repair" to true in conf?
> >> >> >> >>> >
> >> >> >> >>> > 2012/7/10 Edward J. Yoon <[email protected]>
> >> >> >> >>> >
> >> >> >> >>> >> I installed Hama 0.5 on 4 racks (4 x 18 nodes) w/ 3 tasks per
> >> >> >> >>> >> each node (bsp.child.java.opts = Xmx3512m), tried to run
> >> >> >> >>> >> PageRank on Wikipedia dataset introduced at
> >> >> >> >>> >> http://wiki.apache.org/hama/WriteHamaGraphFile
> >> >> >> >>> >>
> >> >> >> >>> >> Doesn't work, the cause of job fail error is different every
> >> >> >> >>> >> time. NullPointerException at loadVertices(), ERROR
> >> >> >> >>> >> bsp.BSPTask: Started pinging to groom, ..., etc.
> >> >> >> >>> >>
> >> >> >> >>> >> Has anyone tried this?
> >> >> >> >>> >>
> >> >> >> >>> >> --
> >> >> >> >>> >> Best Regards, Edward J. Yoon
> >> >> >> >>> >> @eddieyoon
