Hi, I'm having a problem running MapReduce jobs in Hadoop. Whenever I try to run a MapReduce job, I get the following exception:
    Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=slave, access=EXECUTE, inode="map_red":master:supergroup:rwx------
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:155)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:125)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4811)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:4790)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:1876)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getFileInfo(NameNode.java:747)

Note: this is not the entire stack trace.

I found that this happens because the directory /map_red inside HDFS does not have the necessary permissions. I use hadoop fs -chmod -R to change the permissions of /map_red to 777, after which the MapReduce job runs properly. However, the next time a MapReduce job runs, the permissions are reverted back to rwx------, and I have to do the same thing again to get it working. Is there any way to ensure that the permissions of /map_red always remain rwxrwxrwx?

The relevant contents of my mapred-site.xml are below (the exact commands I run are shown after the config):

    ...
    ...
    <property>
      <name>mapred.temp.dir</name>
      <value>/map_red_temp</value>
    </property>
    <property>
      <name>mapred.system.dir</name>
      <value>/map_red</value>
    </property>
    ...
    ...
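For reference, this is the workaround I run after each job, plus the listing I use to confirm the permissions (the -ls check is just how I verify things; the chmod is the command described above):

    # Check the current permissions on /map_red (they revert to rwx------ after each job)
    hadoop fs -ls /

    # Reset the permissions so the next MapReduce job can run
    hadoop fs -chmod -R 777 /map_red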