[ https://issues.apache.org/jira/browse/RANGER-3083?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17773957#comment-17773957 ]
Kunal commented on RANGER-3083:
-------------------------------
The denyAllElse flag behaves differently in Ranger HDFS policies for users with
READ/EXECUTE access when Hive tables are accessed through HiveServer2.
Reproducer steps:
# Set hive.server2.enable.doAs=true (HiveServer2 impersonation)
# Create a Hive database with an explicit location:
create database testdb LOCATION '/data/staging/testdb'
# Create a Hive Ranger policy:
Database: testdb
Table: *
Columns: *
User1 -> Full Access
User2, User3 -> Select, Read
Deny All Other Accesses: true
# Create an HDFS Ranger policy (a REST sketch of this policy follows the list):
Resource: /data/staging/testdb
User1 -> Read, Write, Execute
User2, User3 -> Read, Execute
Deny All Other Accesses: true
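For reference, a minimal sketch of step 4 as a call against the Ranger public v2 policy REST API. The admin URL, credentials, policy name and the HDFS service name (cm_hdfs) are placeholders for our environment, and the path is marked recursive so the table subdirectories are covered; field names should be checked against your Ranger version:

cat > hdfs_testdb_policy.json <<'EOF'
{
  "service": "cm_hdfs",
  "name": "testdb_staging_location",
  "isEnabled": true,
  "isDenyAllElse": true,
  "resources": {
    "path": { "values": ["/data/staging/testdb"], "isRecursive": true }
  },
  "policyItems": [
    { "users": ["user1"],
      "accesses": [ { "type": "read",    "isAllowed": true },
                    { "type": "write",   "isAllowed": true },
                    { "type": "execute", "isAllowed": true } ] },
    { "users": ["user2", "user3"],
      "accesses": [ { "type": "read",    "isAllowed": true },
                    { "type": "execute", "isAllowed": true } ] }
  ]
}
EOF
curl -u admin:admin -H 'Content-Type: application/json' \
     -X POST 'http://ranger-admin-host:6080/service/public/v2/api/policy' \
     -d @hdfs_testdb_policy.json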
Unexpected behaviour:
If User2 or User3 runs a SELECT query through Hive on any table under testdb, the
query fails because Hive requests WRITE access on the database location
'/data/staging/testdb'. Yet even with "Deny All Other Accesses" set to true, those
users can still READ the files under that location directly from HDFS.
If "Deny All Other Accesses" is disabled, the WRITE access request is no longer
seen and the SELECT query succeeds.
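For concreteness, this is roughly how both observations were verified; the HiveServer2 JDBC URL, Kerberos principal and the data file name are placeholders for our environment:

# Direct HDFS access as user2 works: READ/EXECUTE are explicitly allowed by the policy.
kinit user2                      # or: export HADOOP_USER_NAME=user2 on a non-secure cluster
hdfs dfs -ls /data/staging/testdb/test_table
hdfs dfs -cat /data/staging/testdb/test_table/000000_0

# The same user's SELECT through HiveServer2 fails at compile time with the
# "Unable to determine if ... is read only" error shown in the snippet below.
beeline -u 'jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM' \
        -e 'SELECT * FROM testdb.test_table LIMIT 1;'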
ERROR snippet (with the "hive --verbose" option):
Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Unable to determine if hdfs://nn1/data/staging/testdb/test_table is read only: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2083)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:12074)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12170)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11716)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:286)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:659)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1826)
    at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1773)
    at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1768)
    at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
    at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:198)
    ... 26 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to determine if hdfs://nn1/data/staging/testdb/test_table is read only: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathReadOnly(SemanticAnalyzer.java:2508)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getStagingDirectoryPathname(SemanticAnalyzer.java:2565)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2376)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:2075)
    ... 36 more
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException:null
    at sun.reflect.GeneratedMethodAccessor582.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.shims.Hadoop23Shims.checkFileAccess(Hadoop23Shims.java:868)
    at org.apache.hadoop.hive.common.FileUtils.checkFileAccessWithImpersonation(FileUtils.java:402)
    at org.apache.hadoop.hive.common.FileUtils.checkFileAccessWithImpersonation(FileUtils.java:370)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.isPathReadOnly(SemanticAnalyzer.java:2499)
    ... 39 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException:Permission denied: user=user2, access=WRITE, inode="/data/staging/testdb/test_table"
    at org.apache.ranger.authorization.hadoop.RangerHdfsAuthorizer$RangerAccessControlEnforcer.checkPermission(RangerHdfsAuthorizer.java:466)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1906)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1890)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1840)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8001)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2283)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1693)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1604)
    at org.apache.hadoop.ipc.Client.call(Client.java:1550)
    at org.apache.hadoop.ipc.Client.call(Client.java:1447)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy33.checkAccess(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.checkAccess(ClientNamenodeProtocolTranslatorPB.java:1740)
    at sun.reflect.GeneratedMethodAccessor583.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy34.checkAccess(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.checkAccess(DFSClient.java:2867)
    at org.apache.hadoop.hdfs.DistributedFileSystem$64.doCall(DistributedFileSystem.java:2862)
    at org.apache.hadoop.hdfs.DistributedFileSystem$64.doCall(DistributedFileSystem.java:2859)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.access(DistributedFileSystem.java:2872)
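The tail of the trace shows what is happening: Hive's isPathReadOnly() probes the location with a FileSystem.access(path, WRITE) call, which reaches the NameNode's checkAccess and is denied there by RangerAccessControlEnforcer. On Hadoop releases whose "hdfs dfs -test" command supports the -r/-w flags (worth verifying on your version), a similar probe can be issued without Hive at all:

# As user2: probe write access on the table location, the way Hive's read-only check does.
# With the denyAllElse HDFS policy enabled we expect this to be refused,
# even though user2 never intends to write.
export HADOOP_USER_NAME=user2    # or: kinit user2 on a Kerberos cluster
hdfs dfs -test -w /data/staging/testdb/test_table \
  && echo "write access granted" \
  || echo "write access denied"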
> denyAllElse policy does not handle access request with access-type of '_any'
> ----------------------------------------------------------------------------
>
> Key: RANGER-3083
> URL: https://issues.apache.org/jira/browse/RANGER-3083
> Project: Ranger
> Issue Type: Bug
> Components: Ranger
> Reporter: Abhay Kulkarni
> Assignee: Abhay Kulkarni
> Priority: Major
> Fix For: 3.0.0, 2.2.0
>
>
> If the 'isDenyAllElse' flag is set in a Ranger policy, then the expected behavior
> is that access is denied for all users and/or access types not explicitly
> allowed by the policy. However, such a policy is not correctly evaluated if the
> access type is '_any' (which Ranger uses internally to evaluate certain
> access operations).
--
This message was sent by Atlassian Jira
(v8.20.10#820010)