Hi guys!

Things I have already done:
$ chmod -R 777 hadoop_extract
$ chmod -R 777 /app

@Joey
I have created the DFS directory /user/arun, made arun the owner, and tried the following:
 1>
arun@arun-Presario-C500-RU914PA-ACJ:/$ /usr/local/hadoop/bin/hadoop jar
/usr/local/hadoop/hadoop-examples-0.20.203.0.jar wordcount
-Dmapred.job.queue.name=myqueue1 /user/arun/wcin /user/arun/wcout2
Exception in thread "main" java.io.IOException: Permission denied
        at java.io.UnixFileSystem.createFileExclusively(Native Method)
        at java.io.File.checkAndCreate(File.java:1704)
        at java.io.File.createTempFile(File.java:1792)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:115)

2>
arun@arun-Presario-C500-RU914PA-ACJ:/$ /usr/local/hadoop/bin/hadoop jar
/usr/local/hadoop/hadoop-examples-0.20.203.0.jar wordcount
-Dmapred.job.queue.name=myqueue1 /user/hduser/wcin /user/hduser/wcout2
Exception in thread "main" java.io.IOException: Permission denied
        at java.io.UnixFileSystem.createFileExclusively(Native Method)
        at java.io.File.checkAndCreate(File.java:1704)
        at java.io.File.createTempFile(File.java:1792)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
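One thing I notice: both traces fail inside java.io.File.createTempFile, i.e. on the local filesystem where RunJar unpacks the job jar, not inside HDFS. A quick sketch of a check for that (my assumption: the temp dir comes from hadoop.tmp.dir, falling back to /tmp; HADOOP_TMP_DIR below is just a placeholder variable, not a real Hadoop setting):

```shell
# Assumption: RunJar unpacks the job jar under hadoop.tmp.dir (or /tmp),
# so the "Permission denied" may be on the LOCAL filesystem, not on HDFS.
# Probe whether the submitting user can actually create a file there.
CHECK_DIR="${HADOOP_TMP_DIR:-/tmp}"   # HADOOP_TMP_DIR is a placeholder here
probe="$CHECK_DIR/hadoop-unjar-probe.$$"
if touch "$probe" 2>/dev/null; then
  rm -f "$probe"
  echo "$CHECK_DIR is writable by $(whoami)"
else
  echo "$CHECK_DIR is NOT writable by $(whoami)"
fi
```

If that probe fails for user arun, chmod on the HDFS side would not help, since the error happens before the job ever reaches the cluster.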

@Uma

I have set the mapreduce.jobtracker.staging.root.dir property in
mapred-site.xml to /user and restarted the cluster, but that doesn't work.
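For reference, this is a sketch of the entry as I would write it in mapred-site.xml (standard Hadoop property format, value /user as described above):

```xml
<!-- sketch: mapred-site.xml entry, value as described above -->
<property>
  <name>mapreduce.jobtracker.staging.root.dir</name>
  <value>/user</value>
</property>
```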

@Aaron
I have set the config value "dfs.permissions" to "false" in hdfs-site.xml
and restarted.
I still get the same error as above when running the application.
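Again for reference, a sketch of what I put in hdfs-site.xml (standard property format):

```xml
<!-- sketch: hdfs-site.xml entry, as described above -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```

Since the error is unchanged even with HDFS permission checking disabled, I suspect the failure may not be an HDFS permission problem at all.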

The only thing left to try is: hadoop fs -chmod 777

Arun

 

--
View this message in context: 
http://lucene.472066.n3.nabble.com/Submitting-Jobs-from-different-user-to-a-queue-in-capacity-scheduler-tp3345752p3347813.html
Sent from the Hadoop lucene-users mailing list archive at Nabble.com.
