That would make sense, but I am pretty sure this is not the issue. In this
config, I am running with 1024 MB of memory. I thought Nutch was able to run
on this amount of memory; it would just take much longer.
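For reference, this is roughly how I have the heap set. A minimal sketch, assuming the stock bin/nutch launcher, which reads NUTCH_HEAPSIZE (in MB) to size the JVM heap; the 1024 value matches the config described above:

```shell
# Assumption: bin/nutch uses NUTCH_HEAPSIZE (MB) for the JVM max heap.
export NUTCH_HEAPSIZE=1024
echo "running crawl with heap=${NUTCH_HEAPSIZE}MB"
# then: bin/nutch crawl urls -dir crawl -depth 3   (illustrative invocation)
```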

I tried to run the same crawl using the SMB plugin on a Linux machine with
8 GB of memory. It ran longer, of course, but in the end I got the same
error. I have turned on various levels of logging and debugging, and I have
had no luck figuring out what might be causing it.

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Have-yet-to-complete-a-very-large-filesystem-crawl-tp1076547p1085270.html
Sent from the Nutch - User mailing list archive at Nabble.com.
