To: user@nutch.apache.org
Sent: Tue, 22 Oct 2013 02:47:20 +0300 (EEST)
Subject: [Nutch 2.2.1] Error java.lang.OutOfMemoryError: GC overhead limit exceeded
Hi,
I saw a strange out-of-memory error today in the parsing stage: Error
java.lang.OutOfMemoryError: GC overhead limit exceeded. I have seen heap-space
OOM errors before, but never one that said "GC overhead limit exceeded".
Below is the log. Can anyone please
Hello,
I am getting the same error, and here is the log:
2012-08-11 13:33:08,223 ERROR http.Http - Failed with the following error:
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2271)
at
I was able to do jstack just before the program exited. The output is attached.
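(For reference: a thread dump like the one attached can be captured with the
JDK's jstack tool; the pid here is a placeholder for the running Nutch JVM.)
jstack -l <nutch-jvm-pid> > threads.txt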
-----Original Message-----
From: alxsss alx...@aim.com
To: user user@nutch.apache.org
Sent: Sat, Aug 11, 2012 2:17 pm
Subject: Re: java.lang.OutOfMemoryError: GC overhead limit exceeded
Hello,
I am
Hi,
Of course setting a bigger heap helps, but most of the time only
temporarily. Can you see in the logs what type of documents are being parsed?
In the case of HTML documents crawled on the open web, a single document can
cause the heap to explode. By default the CyberNeko parser (in HtmlParser)
is
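(A mitigation not spelled out in this thread: capping how much of each fetched
document Nutch keeps bounds the per-document memory cost. This goes in
conf/nutch-site.xml; 65536 bytes is the usual default, shown here only as an
illustration.)
<property>
  <name>http.content.limit</name>
  <!-- bytes of content kept per fetched document; -1 disables the limit -->
  <value>65536</value>
</property>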
Hi Ferdy,
When you get the out-of-memory error, if you have these options set on the JVM:
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp
you get a file on your filesystem with a heap dump taken at the instant of the
problem.
You can use http://www.eclipse.org/mat/ (an Eclipse extension) that is a
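(One way to get those flags onto the Nutch JVM in local mode; a sketch that
assumes your bin/nutch script passes NUTCH_OPTS through to java, with the parse
step as an example command:)
export NUTCH_OPTS="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp"
bin/nutch parse <batchId>
(The resulting .hprof file under /var/tmp can then be opened in Eclipse MAT.)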
It was crawling HTML files when it started throwing the exception.
Unfortunately, I didn't keep copies of the files or URLs.
On Thu, Aug 9, 2012 at 3:07 AM, Ferdy Galema ferdy.gal...@kalooga.com wrote:
Hi,
Of course setting a bigger heap helps, but most of the time only
temporarily. Can
Is this something other people are seeing? I was parsing 10k URLs when I
got this exception. I'm running Nutch 2 head as of Aug 6 with the default
memory settings (1 GB).
Just wondering if anybody else has experienced this on Nutch 2.
Thanks.
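(Raising the default heap for local, non-Hadoop runs; a minimal sketch that
assumes bin/nutch reads NUTCH_HEAPSIZE in megabytes, which defaults to 1000:)
export NUTCH_HEAPSIZE=2000
bin/nutch parse <batchId>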
If you are using Nutch in a Hadoop cluster and you have enough memory, try
these parameters:
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1600m -XX:-UseGCOverheadLimit -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp</value>
</property>
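(A note on -XX:-UseGCOverheadLimit: it turns off the JVM check that raises
"GC overhead limit exceeded" when nearly all time is being spent in GC, so the
job runs on until a plain heap-space OOM, which with the flags above also
leaves a heap dump. To set the option per job instead of cluster-wide in
mapred-site.xml, a hedged sketch that assumes the Nutch job accepts Hadoop's
generic -D options:)
bin/nutch parse -D mapred.child.java.opts="-Xmx1600m -XX:-UseGCOverheadLimit" <batchId>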
On Wed, Aug 8, 2012 at 9:32 PM, Bai