Hi guys,

Could you send a stack trace of the process? Have you tried using a profiler
to check where the memory is being used?
Check http://hadoop.apache.org/common/docs/current/mapred_tutorial.html for
instructions on how to profile with Hadoop in (pseudo) distributed mode.
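In case it helps, here is a minimal sketch of how that is typically wired up,
assuming the old org.apache.hadoop.mapred.JobConf API that Nutch 1.0 bundles.
The enableHeapProfiling helper, the hprof options and the "0-2" task range are
just illustrative values, not Nutch's actual merge code:

    import org.apache.hadoop.mapred.JobConf;

    // Sketch only: given the JobConf the merge job will run with, switch on
    // per-task hprof profiling. Equivalent config properties:
    // mapred.task.profile, mapred.task.profile.params,
    // mapred.task.profile.maps, mapred.task.profile.reduces.
    void enableHeapProfiling(JobConf job) {
        job.setProfileEnabled(true);
        // heap=sites reports the allocation sites holding the most memory.
        job.setProfileParams(
            "-agentlib:hprof=heap=sites,depth=8,force=n,thread=y,verbose=n,file=%s");
        job.setProfileTaskRange(true, "0-2");   // profile the first three map tasks
        job.setProfileTaskRange(false, "0-2");  // ...and the first three reduce tasks
    }

The profile.out files end up in the task userlogs directories and should show
which allocation sites dominate the heap. The same properties can also be set
in the Hadoop configuration files if that is easier than touching code.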

Julien
-- 
DigitalPebble Ltd
http://www.digitalpebble.com

2009/11/8 Fadzi Ushewokunze <fa...@butterflycluster.net>

> I have a similar issue; I haven't been able to get to the bottom of it.
>
> On Sat, 2009-11-07 at 23:31 -0500, kevin chen wrote:
> > Hi, I have been using a trunk version of Nutch since July 2007. It has
> > been running fine since.
> >
> > Recently I have been experimenting with Nutch 1.0. Everything worked as
> > well as or better than before until I started to use MergeSegments. I was
> > merging segments with around 20k URLs and it gave me an OutOfMemoryError.
> > I have tried increasing the Java max heap to 3G, but I still get an
> > OutOfMemoryError. In contrast, in my older version of Nutch, the same
> > merge works with the default Java max heap setting of only 1G.
> >
> > Does anybody have the same experience? Is there any workaround for this?
> >
> > Thanks
> > Kevin Chen
> >
>
>
