[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2022-11-29 Thread Sugumar Srinivasan (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17640696#comment-17640696 ]

Sugumar Srinivasan commented on RANGER-2976:


Hi all,

I am also facing a similar issue. Is there a fix available for this?

Version Details are below:
 # Apache Hadoop - 3.3.4
 # Apache Hive - 3.1.3
 # Apache Ranger - 2.0.0

Thanks & Regards, 

Sugumar Srinivasan. 

> User can not create external table in Hive Plugin
> -
>
> Key: RANGER-2976
> URL: https://issues.apache.org/jira/browse/RANGER-2976
> Project: Ranger
>  Issue Type: Bug
>  Components: plugins
>Affects Versions: 2.0.0
>Reporter: Janus Chow
>Priority: Major
> Attachments: RANGER-2976.patch
>
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> A user "userA" want's to create an external table on "hdfs://test/testDir" 
> via Hive Metastore installed Ranger Hive plugin. Permission information is as 
> follows.
> {code:java}
> # id userA
> uid=3044(userA) gid=3044(userA) groups=992(supergroup)
> # hadoop fs -ls hdfs://test
> drwxrwxr-x   - userB supergroup          0 2019-01-01 00:00 
> hdfs://test/testDir
> # hadoop fs -ls hdfs://test/testDir
> -rw-rw-r--   3 userB supergroup    100 2019-01-01 00:00 
> hdfs://test/testDir/part-0-db98bf17-bda6-4da9-9ea4-d7c75e8d995e-c000.snappy.parquet{code}
> When "userA" is trying to create an external table on "hdfs://test/testDir" 
> with the following command, 
> {code:java}
> spark.sql("create table userA_test USING org.apache.spark.sql.parquet OPTIONS 
> ( path = 'hdfs://test/testDir')")
> {code}
> Ranger denied the operation with the following error message.
> {code:java}
> org.apache.hadoop.hive.ql.metadata.HiveException: 
> MetaException(message:Permission denied: user [userA] does not have [ALL] 
> privilege on [hdfs://test/testDir])
> {code}
> The reason is that when Ranger checks the URI permission, it requires the 
> user to have FsAction.ALL on the URI when "userA" is not the owner of the 
> HDFS path. However, HDFS files do not have the execute permission set by 
> default, so the Ranger permission check returns false.
> I think the getURIAccessType function in RangerHiveAuthorizer should return 
> FsAction.READ_WRITE instead of FsAction.ALL. For HDFS directories, Hadoop 
> adds FsAction.EXECUTE for us during the permission check, so we can skip 
> FsAction.EXECUTE here and still work correctly with HDFS files.
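
For illustration, a minimal sketch of the change described above, assuming getURIAccessType maps HiveOperationType values to FsAction requirements; the actual method in RangerHiveAuthorizer (and the attached RANGER-2976.patch) may differ:

{code:java}
// Sketch only: relax the URI requirement for table creation from ALL to
// READ_WRITE, since plain HDFS data files normally lack the execute bit.
// The real getURIAccessType in RangerHiveAuthorizer covers many more
// operation types; this shows just the proposed change.
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType;

public class UriAccessTypeSketch {
    static FsAction getURIAccessType(HiveOperationType hiveOpType) {
        switch (hiveOpType) {
            case CREATETABLE:
            case CREATETABLE_AS_SELECT:
                // Previously FsAction.ALL; EXECUTE is not set on ordinary HDFS
                // files by default, so require only read + write on the URI.
                return FsAction.READ_WRITE;
            default:
                return FsAction.NONE;
        }
    }

    public static void main(String[] args) {
        System.out.println(getURIAccessType(HiveOperationType.CREATETABLE)); // READ_WRITE
    }
}
{code}

With READ_WRITE as the requirement, the rw-rw-r-- file shown in the ls output above satisfies the check for members of supergroup, while directories are unaffected because Hadoop adds EXECUTE for directories during the check.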





[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2020-08-31 Thread Ramesh Mani (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17187452#comment-17187452 ]

Ramesh Mani commented on RANGER-2976:
-

[~Symious] Even for the files under hdfs://test/testdir, use a URL policy like 
"hdfs://test/testdir/*". Without this check, any user could point to this 
location and create a table that reads it, which is not desirable. Another 
option is to enable doAs=true for Hive.
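
For reference, a sketch of the doAs option mentioned above; hive.server2.enable.doAs is the standard HiveServer2 impersonation switch in hive-site.xml (assumed here, verify against your deployment):

{code:xml}
<!-- hive-site.xml: with impersonation enabled, queries run as the calling
     user, so HDFS permissions are evaluated against the actual end user
     rather than the hive service account. -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
{code}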



[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2020-08-30 Thread Janus Chow (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17187442#comment-17187442 ]

Janus Chow commented on RANGER-2976:


[~rmani], the URL policy is fine for "hdfs://test/testdir": since "userA" is in 
the supergroup group, userA does have FsAction.ALL on "hdfs://test/testdir".

The problem happens when Ranger tries to check the permission of the files 
under "hdfs://test/testdir": FsAction.EXECUTE is usually not set on files, so 
the permission check fails.
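
As a small illustration of that check, the group bits of the rw-rw-r-- file from the issue grant READ_WRITE but not ALL, so a requirement of FsAction.ALL fails even though read and write are both available (sketch using the Hadoop FsPermission/FsAction classes):

{code:java}
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.fs.permission.FsPermission;

public class FilePermissionDemo {
    public static void main(String[] args) {
        // Mode of the parquet file from the ls output in the issue description.
        FsPermission filePerm = FsPermission.valueOf("-rw-rw-r--");
        FsAction groupAction = filePerm.getGroupAction(); // what supergroup members get

        System.out.println("group bits         : " + groupAction);                              // READ_WRITE
        System.out.println("implies FsAction.ALL: " + groupAction.implies(FsAction.ALL));        // false
        System.out.println("implies READ_WRITE  : " + groupAction.implies(FsAction.READ_WRITE)); // true
    }
}
{code}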



[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2020-08-30 Thread Ramesh Mani (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17187437#comment-17187437 ]

Ramesh Mani commented on RANGER-2976:
-

[~Symious] URL authorizations are done in the Ranger Hive plugin by Hive URL 
policies. Please maintain the policies for "hdfs://test/testdir". You also need 
the following parameter in ranger-hive-security.xml to enable it.

ranger.plugin.hive.urlauth.filesystem.schemes=[file:|file:///,wasb:,wasbs:,adl:]
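
For context, a sketch of how that parameter sits in ranger-hive-security.xml; the scheme list below is an assumption (hdfs: would need to be included for hdfs:// URLs to be authorized through Hive URL policies), so adjust it to your deployment:

{code:xml}
<!-- ranger-hive-security.xml: filesystem schemes for which the Ranger Hive
     plugin applies URL authorization. The value below is illustrative only. -->
<property>
  <name>ranger.plugin.hive.urlauth.filesystem.schemes</name>
  <value>hdfs:,file:</value>
</property>
{code}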

 



[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2020-08-30 Thread Janus Chow (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17187425#comment-17187425 ]

Janus Chow commented on RANGER-2976:


[~pradeep] I have created and linked the pull request; please take a look.



[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2020-08-30 Thread Pradeep Agrawal (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17187420#comment-17187420 ]

Pradeep Agrawal commented on RANGER-2976:
-

[~Symious]: I will be able to let you know only after testing this patch. It 
may take time to get the environment and test it.



[jira] [Commented] (RANGER-2976) User can not create external table in Hive Plugin

2020-08-29 Thread Janus Chow (Jira)


[ https://issues.apache.org/jira/browse/RANGER-2976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17187137#comment-17187137 ]

Janus Chow commented on RANGER-2976:


Hi [~pradeep], can you have a look at this patch?
