Hi,

You have to check hadoop.log to get more details about this error; the "Job failed!" message is very general. As I said before, Nutch error messages are not well designed.
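
For example, something along these lines usually shows the underlying exception (assuming a standard Nutch 1.x layout where log4j writes to logs/hadoop.log under your Nutch runtime directory):

    cd $NUTCH_HOME                      # wherever you run bin/nutch from
    tail -n 200 logs/hadoop.log         # the real stack trace is logged right before "Job failed!"
    grep -in exception logs/hadoop.log  # or search the whole log for the underlying exception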
I had this error before when I added a plugin: after running ant, I didn't find the plugin folder in the build folder.
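
If that turns out to be your case as well, a quick check looks like this (the plugin name below is only a placeholder):

    ant                                 # rebuild Nutch
    ls build/plugins/                   # the plugin's folder should show up here after the build
    ls build/plugins/index-myplugin/    # placeholder name; should contain plugin.xml and the plugin jar

Also make sure the plugin is listed in the plugin.includes property in conf/nutch-site.xml.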

Thanks


> Date: Wed, 14 Oct 2009 22:28:48 -0700
> From: mehalaki...@gmail.com
> To: nutch-user@lucene.apache.org
> Subject: NUTCH_CRAWLING
> 
> 
> Hi, 
> 
> bin/nutch crawl urls -dir crawl_NEW1 -depth 3 -topN 50 
> 
> I have used the above command to crawl. 
> 
> I am getting the following error. 
> 
> Dedup: adding indexes in: crawl_NEW1/indexes 
> Exception in thread "main" java.io.IOException: Job failed! 
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604) 
>         at org.apache.nutch.indexer.DeleteDuplicates.dedup(DeleteDuplicates.java:439) 
>         at org.apache.nutch.crawl.Crawl.main(Crawl.java:135) 
> 
> 
> Can anyone help me resolve this problem? 
> 
> Thank you in advance. 
> 
> -- 
> View this message in context: 
> http://www.nabble.com/NUTCH_CRAWLING-tp25903220p25903220.html
> Sent from the Nutch - User mailing list archive at Nabble.com.
> 
                                          