Disk full? That DiskErrorException from LocalDirAllocator usually means none of the configured local directories had enough free space to write the map spill file.
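
If you are running in local mode (which the job_local_0001 attempt id suggests), the spill files go under mapred.local.dir, which by default lives under ${hadoop.tmp.dir}/mapred/local. Check how much space is left on that partition (df -h), or point hadoop.tmp.dir at a disk with more room in your conf/hadoop-site.xml; the path below is only an example:

  <property>
    <name>hadoop.tmp.dir</name>
    <value>/path/with/enough/space/hadoop-tmp</value>
  </property>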

2009/12/2 BELLINI ADAM <mbel...@msn.com>

>
> Hi,
> I have this error when crawling:
>
> org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any
> valid local directory for
> taskTracker/jobcache/job_local_0001/attempt_local_0001_m_000000_0/output/spill0.out
>        at
> org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:335)
>        at
> org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:124)
>        at
> org.apache.hadoop.mapred.MapOutputFile.getSpillFileForWrite(MapOutputFile.java:107)
>        at
> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:930)
>        at
> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:842)
>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>        at
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:138)
> 2009-12-01 19:02:25,778 FATAL crawl.Generator - Generator:
> java.io.IOException: Job failed!
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1232)
>        at org.apache.nutch.crawl.Generator.generate(Generator.java:472)
>        at org.apache.nutch.crawl.Generator.run(Generator.java:618)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>        at org.apache.nutch.crawl.Generator.main(Generator.java:581)
>
> I searched the mailing list and didn't find an answer.
>
> help
>




-- 
DigitalPebble Ltd
http://www.digitalpebble.com
