I'm new to Hadoop and I'm getting the following exception when I try to run my job on a Hadoop cluster:

org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find any valid local directory for jobcache/job_201409031055_3865/jars/job.jar
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:376)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:146)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:127)
    at org.apache.hadoop.mapred.JobLocalizer.localizeJobJarFile(JobLocalizer.java:268)
    at org.apache.hadoop.mapred.JobLocalizer.localizeJobFiles(JobLocalizer.java:380)
    at org.apache.hadoop.mapred.JobLocalizer.localizeJobFiles(JobLocalizer.java:370)
    at org.apache.hadoop.mapred.DefaultTaskController.initializeJob(DefaultTaskController.java:232)
    at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1381)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java

Can anyone please tell me what the problem might be?

Best regards,
Marko
