I am running into a few issues running Nutch with distributed Hadoop on two
nodes:

Configuration:
2 nodes: one is master+slave, the second is just a slave.

I set mapred.map.tasks and mapred.reduce.tasks to 2.
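
For reference, both settings are plain Hadoop properties; a minimal sketch
of just those two entries as they would appear in conf/hadoop-site.xml:

  <property>
    <name>mapred.map.tasks</name>
    <value>2</value>
  </property>
  <property>
    <name>mapred.reduce.tasks</name>
    <value>2</value>
  </property>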

The crawl works fine on a single node (only one node acting as
master+slave). When I add the second node to conf/slaves, the crawl fails
with the message: Stopping at depth=0 - no more URLs to fetch
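
conf/slaves is just one hostname per line; mine is the equivalent of this
(hostnames here are placeholders, not my real ones):

  master-node
  slave-node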

I am also seeing this log4j error:
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /nutch/search/logs (Is a directory)
        at java.io.FileOutputStream.openAppend(Native Method)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:177)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:102)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:289)
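
In case it is relevant: the stock Nutch log4j.properties builds the
appender's file path from two system properties, roughly like this
(quoting from an unmodified distribution, so assuming no local changes):

  log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
  log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}

If only hadoop.log.dir is set and hadoop.log.file is empty, the File path
resolves to the logs directory itself, which would match the "Is a
directory" exception above.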


Please help.