Hello,

I've set up Hadoop on two machines and would like to test it with a simple test job. The setup and the program work with a single-node setup, but not in the distributed environment. I get the following error when I run the simple org.myorg.WordCount program:

bin/hadoop jar examples/wordcount.jar org.myorg.WordCount input2 output22

13/03/13 17:40:56 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/03/13 17:40:56 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/03/13 17:40:56 INFO input.FileInputFormat: Total input paths to process : 3
13/03/13 17:40:56 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/03/13 17:40:56 WARN snappy.LoadSnappy: Snappy native library not loaded
13/03/13 17:40:56 INFO mapred.JobClient: Running job: job_201303131649_0008
13/03/13 17:40:57 INFO mapred.JobClient:  map 0% reduce 0%
13/03/13 17:41:07 INFO mapred.JobClient: Task Id : attempt_201303131649_0008_m_000004_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201303131649_0008_m_000004_0: execvp: Permission denied
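
For reference, the job is essentially the standard WordCount from the Hadoop MapReduce tutorial, using the new org.apache.hadoop.mapreduce API. A minimal sketch of what I'm running (reconstructed from the tutorial, so treat the details as approximate):

package org.myorg;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Emits (word, 1) for every token in each input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Sums the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class); // tells Hadoop which jar holds the user classes
    job.setMapperClass(TokenizerMapper.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. input2
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // e.g. output22
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}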

I'm not sure what is actually causing the "Permission denied". Do you have any hints? My user can access the HDFS-formatted space.

Thanks,
Heinz

