Trystan Leftwich created SPARK-11248:
----------------------------------------

             Summary: Spark HiveThriftServer uses the wrong user when checking 
HDFS permissions
                 Key: SPARK-11248
                 URL: https://issues.apache.org/jira/browse/SPARK-11248
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.5.1, 1.5.0
            Reporter: Trystan Leftwich


When Spark is run as a HiveThriftServer via YARN, it checks HDFS permissions as 
the user running the Thrift server rather than as the user connecting via JDBC.

For example:

In HDFS the permissions are:
rwx------   3 testuser testuser /user/testuser/table/testtable

and I connect via beeline as user testuser:
beeline -u 'jdbc:hive2://localhost:10511' -n 'testuser' -p ''


If I try to query that table:

select count(*) from test_table;

I get the following error:

Error: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table test_table. java.security.AccessControlException: Permission denied: user=hive, access=READ, inode="/user/testuser/table/testtable":testuser:testuser:drwxr-x--x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:185)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6795)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6777)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6702)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:9529)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:1516)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1433)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:962)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2039)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2035)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2033) (state=,code=0)
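A note on the mode shown in the listing above: rwx------ (octal 700) grants access only to the owner, so any HDFS check performed as user=hive rather than as testuser has to fail regardless of the table's contents. A minimal local sketch of the same POSIX-style permission semantics (the temp directory is illustrative, not part of this report):

```shell
# Illustration only: create a directory with the same mode as the
# HDFS listing above (rwx------ = 700) and read the mode back.
# Only the owning user can access a mode-700 directory.
demo=$(mktemp -d)
chmod 700 "$demo"
stat -c '%a' "$demo"   # GNU stat; prints the octal mode: 700
rm -rf "$demo"
```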


I have the following set in hive-site.xml, so it should be using the correct 
user:

    <property>
      <name>hive.server2.enable.doAs</name>
      <value>true</value>
    </property>

    <property>
      <name>hive.metastore.execute.setugi</name>
      <value>true</value>
    </property>
    
This works correctly in Hive.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
