Are you sure user 'alex' belongs to the 'hadoop' group? Why not run 'id alex' 
to verify? More importantly, can membership in the 'hadoop' group be confirmed 
on the namenode? HDFS resolves group membership on the NameNode side, not on 
the client where you submit the job.
Yong
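
This can be checked directly. A hedged sketch (assumes the 'alex' account from the thread and a working `hdfs` CLI; it should be run on the NameNode host, since by default HDFS resolves groups there via ShellBasedUnixGroupsMapping):

```shell
# Compare the local Unix groups with what HDFS resolves for 'alex'.
# Run this ON the NameNode host: the client-side group membership shown
# by 'id' on a worker box is not what HDFS permission checks use.
if command -v hdfs >/dev/null 2>&1; then
  id alex                 # local Unix view; must include the 'hadoop' group
  hdfs groups alex        # groups as the NameNode actually resolves them
else
  echo "hdfs CLI not found; run this on a cluster node"
fi
```

If the two outputs disagree, the NameNode host is missing the group membership, which would explain the AccessControlException below.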

Date: Thu, 24 Jul 2014 17:11:06 +0800
Subject: issue about running an MR job as a system user
From: [email protected]
To: [email protected]

hi, maillist:
          I created a system user on one box of my Hadoop cluster, but when I 
run an MR job as this user it fails. The /data directory is used by the 
MapReduce history server, and I also added the user to the hadoop group. Since 
/data's permissions are 775, it should be writable by any user in the hadoop 
group, so why do I still get a permission error? Can anyone help?

 

# useradd alex
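
Creating the account is only half of it: since /data in the thread is 775 and owned hdfs:hadoop, the account must also be in the 'hadoop' group. A minimal sketch (the usermod line needs root; 'alex' and 'hadoop' are the names from the thread):

```shell
# Add alex to the hadoop supplementary group (requires root):
#   usermod -aG hadoop alex
# Read-only checks that work as any user:
getent group hadoop || echo "no 'hadoop' group on this host"
id alex 2>/dev/null || echo "no 'alex' user on this host"
```

Note this must hold on the NameNode host as well, not just on the box where the job is submitted.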
 
<configuration>
        <property>
         <name>mapreduce.framework.name</name>
         <value>yarn</value>
        </property>

        <property>
         <name>mapreduce.jobhistory.address</name>
         <value>192.168.10.49:10020</value>
        </property>

        <property>
         <name>mapreduce.jobhistory.webapp.address</name>
         <value>192.168.10.49:19888</value>
        </property>

        <property>
         <name>yarn.app.mapreduce.am.staging-dir</name>
         <value>/data</value>
        </property>
.........
</configuration>

$ hadoop fs -ls /
Found 6 items
drwxrwxr-x   - hdfs  hadoop          0 2014-07-14 18:17 /data

 
$ hadoop fs -ls /data
Found 3 items
drwx------   - hdfs hadoop          0 2014-07-09 08:49 /data/hdfs
drwxrwxrwt   - hdfs hadoop          0 2014-07-08 18:52 /data/history
drwx------   - pipe hadoop          0 2014-07-14 18:17 /data/pipe

 
[alex@hz23 ~]$ id
uid=501(alex) gid=501(alex) groups=501(alex),497(hadoop)

 
 
[alex@hz23 ~]$ hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.3.0-cdh5.0.2.jar pi 2 100
Number of Maps  = 2
Samples per Map = 100
Wrote input for Map #0
Wrote input for Map #1
Starting Job

14/07/24 17:06:23 WARN security.UserGroupInformation: PriviledgedActionException as:alex (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=alex, access=WRITE, inode="/data":hdfs:hadoop:drwxrwxr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:232)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:176)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5490)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5472)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5446)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:3600)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:3570)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3544)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:739)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:558)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1980)

org.apache.hadoop.security.AccessControlException: Permission denied: user=alex, access=WRITE, inode="/data":hdfs:hadoop:drwxrwxr-x
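
One more thing worth checking: the NameNode caches user-to-group mappings (hadoop.security.groups.cache.secs, 300 seconds by default), so even after fixing the membership on the NameNode host the stale mapping can linger. A hedged sketch of forcing a refresh (requires HDFS superuser privileges on a cluster node):

```shell
# Make the NameNode re-resolve user-to-group mappings instead of waiting
# out the cache (run as the HDFS superuser):
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfsadmin -refreshUserToGroupsMappings
else
  echo "hdfs CLI not found; run this on a cluster node"
fi
```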
                                          
