Can you package 1.4 with updated Hadoop jars? I have problems running
Nutch in local mode: if I run multiple tasks at once, they delete each
other's temporary files. It's worth a try to see whether newer Hadoop
libs fix that.

Hi Radim,

I don't know whether current versions of Hadoop fix this problem.
Many Nutch users have made the bitter experience that running tasks
simultaneously in local mode may fail intermittently, see
http://www.mail-archive.com/[email protected]/msg03239.html

Set hadoop.tmp.dir for each job so that it points to a unique directory.
Instead of setting it in the XML config file, which is inconvenient,
you can set it via Java command-line options or the NUTCH_OPTS environment
variable:
 NUTCH_OPTS="-Dhadoop.tmp.dir=/tmp/uniq_dir" $NUTCH_HOME/bin/nutch ...
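As a sketch of how you might automate this, a small wrapper can create a
fresh directory per job with mktemp so that no two concurrent jobs ever
share a hadoop.tmp.dir. The script and directory names below are
hypothetical; only the NUTCH_OPTS mechanism comes from the advice above:

```shell
#!/bin/sh
# Sketch: give each Nutch local-mode job a private hadoop.tmp.dir so
# concurrent jobs cannot delete each other's temporary files.
# mktemp -d creates a fresh, unique directory on every call.
TMP1=$(mktemp -d /tmp/nutch-job.XXXXXX)
TMP2=$(mktemp -d /tmp/nutch-job.XXXXXX)
echo "job1 tmp: $TMP1"
echo "job2 tmp: $TMP2"

# Each job would then be launched like (path/command hypothetical):
#   NUTCH_OPTS="-Dhadoop.tmp.dir=$TMP1" $NUTCH_HOME/bin/nutch crawl ...

# The two directories are guaranteed to differ:
[ "$TMP1" != "$TMP2" ] && echo "directories are distinct"

# Remove the job-private directories when the jobs are done.
rm -rf "$TMP1" "$TMP2"
```

Cleaning up afterwards matters here, since each job now leaves behind its
own directory under /tmp rather than reusing a shared one.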

Sebastian
