Should I change the value of 'io.sort.mb' and/or 'io.sort.factor'?
And if so, what should I change them to in order to eliminate the error?
Yes, since it looks like it crashes during sorting.
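For reference, a hedged sketch of what lowering those properties could look like in conf/nutch-site.xml (the property names come from the suggestion above; the defaults vary between versions, so treat the values here as illustrative starting points, not recommendations):

```xml
<!-- Sketch: shrink the in-memory sort buffer so that sorting during
     the webdb update needs less heap. Values are illustrative. -->
<property>
  <name>io.sort.mb</name>
  <value>50</value>   <!-- commonly ships as 100 (MB); smaller means less heap -->
</property>
<property>
  <name>io.sort.factor</name>
  <value>5</value>    <!-- streams merged at once; commonly 10 -->
</property>
```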

Also, is there any minimum RAM requirement for Nutch to do indexing and searching?

Well, not really, but you should have 1 GB of RAM if you want to do serious things.
You can set up the memory via the bin/nutch script:
#   NUTCH_HEAPSIZE  The maximum amount of heap to use, in MB.
#                   Default is 1000.

...
JAVA_HEAP_MAX=-Xmx1000m
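The snippet above can be exercised from the shell. A minimal sketch, assuming bin/nutch honors NUTCH_HEAPSIZE in MB as its header comment states (the 2000 here is an illustrative value, not a recommendation):

```shell
# Mirror the heap logic inside bin/nutch: NUTCH_HEAPSIZE (in MB)
# becomes the -Xmx flag passed to the JVM.
NUTCH_HEAPSIZE=2000
JAVA_HEAP_MAX="-Xmx${NUTCH_HEAPSIZE}m"
echo "$JAVA_HEAP_MAX"   # -Xmx2000m
```

In practice you would set the variable when invoking the script, e.g. `NUTCH_HEAPSIZE=2000 bin/nutch updatedb ...`, rather than computing the flag yourself.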

HTH
Stefan
Any help is greatly appreciated.
Thanks in advance.

regards
-Hussain.



----- Original Message ----- From: "Stefan Groschupf" <[EMAIL PROTECTED] style.com>
To: <[email protected]>
Sent: Monday, December 26, 2005 7:18 PM
Subject: Re: "Out of memory error" while updating


Do you have a stack trace?
Is it maybe related to a 'too many open files' exception?
Also, you can try to minimize 'io.sort.mb' and/or 'io.sort.factor'.

Stefan

Am 26.12.2005 um 09:27 schrieb K.A.Hussain Ali:

Hi all,

I am using Nutch to crawl a few sites. When I crawl to a certain
depth and then update the webdb, I get an "Out of Memory" error.

I increased the JVM heap size using JAVA_OPTS and even reduced the
number of tokens indexed per page in nutch-default.xml, but I still
get the error.

I am using Tomcat, and it has only one application running on it.

What are Nutch's system requirements, and what would it take to get rid of this error?

I even tried the things mentioned on the mailing list, but nothing
turned out to be fruitful.

Any help is greatly appreciated.
Thanks in advance

regards
-Hussain.

---------------------------------------------------------------
company:        http://www.media-style.com
forum:        http://www.text-mining.org
blog:            http://www.find23.net




