If you are using Nutch in a Hadoop cluster and you have enough memory, try
these parameters:

<property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx1600m -XX:-UseGCOverheadLimit -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp</value>
</property>
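For context, -XX:-UseGCOverheadLimit disables the "GC overhead limit
exceeded" error and -XX:+HeapDumpOnOutOfMemoryError writes a heap dump to
the given path when the task JVM runs out of memory. Below is a minimal
sketch of how the property might sit in a full mapred-site.xml (or in
Nutch's conf/nutch-site.xml, which gets merged into the job
configuration); the file name and surrounding configuration element are
assumptions, and the heap size and dump path are just the values from
above, to be tuned for your cluster:

<?xml version="1.0"?>
<!-- Hypothetical placement: mapred-site.xml (or conf/nutch-site.xml),
     deployed on every node of the cluster -->
<configuration>
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx1600m -XX:-UseGCOverheadLimit -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp</value>
  </property>
</configuration>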

On Wed, Aug 8, 2012 at 9:32 PM, Bai Shen <baishen.li...@gmail.com> wrote:

> Is this something other people are seeing?  I was parsing 10k urls when I
> got this exception.  I'm running Nutch 2 head as of Aug 6 with the default
> memory settings (1 GB).
>
> Just wondering if anybody else has experienced this on Nutch 2.
>
> Thanks.
>
