Do you have a stack trace?
Could it be related to a "too many open files" exception?
You could also try reducing 'io.sort.mb' and/or 'io.sort.factor'.
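If your build already reads these settings, an override along the following lines in nutch-site.xml might help (property names as I remember them; the values are only a starting point, not tested against your setup):

<property>
  <name>io.sort.mb</name>
  <value>50</value>
  <!-- buffer size in MB used while sorting; lower values need less heap -->
</property>

<property>
  <name>io.sort.factor</name>
  <value>5</value>
  <!-- number of streams merged at once; lower values keep fewer files open -->
</property>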

Stefan

On 26.12.2005 at 09:27, K.A.Hussain Ali wrote:

Hi all,

I am using Nutch to crawl a few sites. When I crawl to a certain depth and then update the webdb,

I get an "Out of Memory" error while the webdb is being updated.

I increased the JVM heap size using JAVA_OPTS and even reduced the number of tokens indexed per page in nutch-default.xml, but I still get the error.
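(The token change I made was along these lines, assuming the relevant property is indexer.max.tokens; the value is approximate:

<property>
  <name>indexer.max.tokens</name>
  <value>5000</value>
  <!-- maximum number of tokens indexed per page, lowered from the default -->
</property>
)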

I am using Tomcat, and I have only one application running on it.

What are the system requirements for Nutch to avoid this error?

I have also tried the suggestions mentioned on the mailing list, but nothing has worked so far.

Any help is greatly appreciated.
Thanks in advance

regards
-Hussain.

---------------------------------------------------------------
company:  http://www.media-style.com
forum:    http://www.text-mining.org
blog:     http://www.find23.net

