(can't access work email, so posting via this account!)

I've tried everything I can think of to resolve this, and spent the
weekend scouring the web for a fix.

Can no-one help?!


Previous email:

Hi all,

I've seen this on the mailing list archives but not a solution.

When I perform:
../bin/nutch inject /opt/nutch/filesystem/data/crawl/crawldb/
/opt/nutch/filesystem/data/seed/

I'm getting:
Injector: org.apache.hadoop.mapred.InvalidInputException: Input path does not exist:
hdfs://localhost:9000/opt/nutch/filesystem/temp/inject-temp-496643776
     at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:179)
     at org.apache.hadoop.mapred.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:39)
     at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:190)
     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:797)
     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1142)
     at org.apache.nutch.crawl.Injector.inject(Injector.java:220)
     at org.apache.nutch.crawl.Injector.run(Injector.java:241)
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
     at org.apache.nutch.crawl.Injector.main(Injector.java:231)

The temp directory is an HDFS directory and does exist, and I have
plenty of disk space left.
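
For reference, these are the sorts of checks I mean (a sketch, assuming the stock Hadoop CLI is on the PATH and the same NameNode address as in the error message; adjust paths for your setup):

```shell
# List the temp directory on HDFS to confirm it exists and to see its
# owner and permission bits (NameNode address taken from the error above).
hadoop fs -ls hdfs://localhost:9000/opt/nutch/filesystem/temp/

# Check the parent directory too, since the job must be able to create
# subdirectories under it.
hadoop fs -ls hdfs://localhost:9000/opt/nutch/filesystem/

# Report overall HDFS capacity and per-DataNode usage.
hadoop dfsadmin -report
```

Note that the `inject-temp-*` directory in the error is created fresh for each run, so it won't be listed beforehand; the question is whether the user running Nutch can create and then read directories under that temp path.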

Anyone know the cause? Is it a permission thing?

Thanks in advance,

Dean.
