[ https://issues.apache.org/jira/browse/HIVE-20001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16524295#comment-16524295 ]

ASF GitHub Bot commented on HIVE-20001:
---------------------------------------

GitHub user beltran opened a pull request:

    https://github.com/apache/hive/pull/380

    HIVE-20001: With doas set to true, running select query as hrt_qa user
    on external table fails due to permission denied to read
    /warehouse/tablespace/managed directory

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/beltran/hive HIVE-20001

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/hive/pull/380.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #380
    
----
commit 3e9dd9a73ae9d33e2f291819b0e10e4296f2b568
Author: Jaume Marhuenda <jaumemarhuenda@...>
Date:   2018-06-26T22:42:14Z

    HIVE-20001: With doas set to true, running select query as hrt_qa user on 
external table fails due to permission denied to read 
/warehouse/tablespace/managed directory

----


> With doas set to true, running select query as hrt_qa user on external table 
> fails due to permission denied to read /warehouse/tablespace/managed 
> directory.
> ------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-20001
>                 URL: https://issues.apache.org/jira/browse/HIVE-20001
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Jaume M
>            Assignee: Jaume M
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: HIVE-20001.1.patch
>
>
> Hive: With doas set to true, running select query as hrt_qa user on external 
> table fails due to permission denied to read /warehouse/tablespace/managed 
> directory.
> Steps: 
> 1. Create an external table.
> 2. Set doas to true (see the verification sketch after these steps).
> 3. Run select count(*) as user hrt_qa.
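> Since doas (impersonation) is enforced on the HiveServer2 side, it can help to confirm hive.server2.enable.doAs before reproducing. A minimal check, assuming the common /etc/hive/conf location for hive-site.xml:
> {code}
> # Verify that HiveServer2 impersonation is enabled
> # (the config path below is the usual default; adjust for your install)
> grep -A1 'hive.server2.enable.doAs' /etc/hive/conf/hive-site.xml
> {code}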
> Table creation query:
> {code}
> beeline -n hrt_qa -p pwd -u 
> "jdbc:hive2://ctr-e138-1518143905142-375925-01-000006.hwx.site:2181,ctr-e138-1518143905142-375925-01-000005.hwx.site:2181,ctr-e138-1518143905142-375925-01-000007.hwx.site:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit"
>  --outputformat=tsv -e "drop table if exists test_table purge;
> create external table test_table(id int, age int) row format delimited fields 
> terminated by '|' stored as textfile;
> load data inpath '/tmp/table1.dat' overwrite into table test_table;"
> {code}
> The select count(*) query execution fails:
> {code}
> beeline -n hrt_qa -p pwd -u 
> "jdbc:hive2://ctr-e138-1518143905142-375925-01-000006.hwx.site:2181,ctr-e138-1518143905142-375925-01-000005.hwx.site:2181,ctr-e138-1518143905142-375925-01-000007.hwx.site:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit"
>  --outputformat=tsv -e "select count(*) from test_table where age>30 and 
> id<10100;"
> 2018-06-22 10:22:29,328|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|SLF4J: Class path contains 
> multiple SLF4J bindings.
> 2018-06-22 10:22:29,330|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|SLF4J: See 
> http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2018-06-22 10:22:29,335|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|SLF4J: Actual binding is of 
> type [org.apache.logging.slf4j.Log4jLoggerFactory]
> 2018-06-22 10:22:31,408|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|Format tsv is deprecated, 
> please use tsv2
> 2018-06-22 10:22:31,529|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|Connecting to 
> jdbc:hive2://ctr-e138-1518143905142-375925-01-000006.hwx.site:2181,ctr-e138-1518143905142-375925-01-000005.hwx.site:2181,ctr-e138-1518143905142-375925-01-000007.hwx.site:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit
> 2018-06-22 10:22:32,031|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|18/06/22 10:22:32 [main]: 
> INFO jdbc.HiveConnection: Connected to 
> ctr-e138-1518143905142-375925-01-000004.hwx.site:10001
> 2018-06-22 10:22:34,130|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|18/06/22 10:22:34 [main]: 
> WARN jdbc.HiveConnection: Failed to connect to 
> ctr-e138-1518143905142-375925-01-000004.hwx.site:10001
> 2018-06-22 10:22:34,244|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|18/06/22 10:22:34 [main]: 
> WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: 
> jdbc:hive2://ctr-e138-1518143905142-375925-01-000004.hwx.site:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit:
>  Failed to open new session: 
> org.apache.hadoop.hive.ql.metadata.HiveException: 
> MetaException(message:java.security.AccessControlException: Permission 
> denied: user=hrt_qa, access=READ, 
> inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
> {code}
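> The same READ check can be reproduced outside of Hive by listing the managed warehouse directory as hrt_qa directly. A sketch, assuming a kerberized cluster and a hypothetical keytab path for the hrt_qa principal:
> {code}
> # Authenticate as hrt_qa (keytab path is illustrative only)
> kinit -kt /etc/security/keytabs/hrt_qa.headless.keytab hrt_qa
> # Expected to fail with the same AccessControlException for user=hrt_qa
> hdfs dfs -ls /warehouse/tablespace/managed/hive
> {code}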
> Warehouse directory listing:
> {code}
> -bash-4.2$ hdfs dfs -ls /warehouse/tablespace/
> Found 2 items
> drwxr-xr-x   - hdfs hdfs          0 2018-06-22 07:01 
> /warehouse/tablespace/external
> drwxr-xr-x   - hdfs hdfs          0 2018-06-22 07:01 
> /warehouse/tablespace/managed
> -bash-4.2$ hdfs dfs -ls /warehouse/tablespace/managed/hive
> Found 5 items
> drwxrwx---+  - hive hadoop          0 2018-06-22 09:28 
> /warehouse/tablespace/managed/hive/all10kw
> drwxrwx---+  - hive hadoop          0 2018-06-22 09:24 
> /warehouse/tablespace/managed/hive/hive8295
> drwxrwx---+  - hive hadoop          0 2018-06-22 07:20 
> /warehouse/tablespace/managed/hive/information_schema.db
> drwxrwxrwx+  - hive hadoop          0 2018-06-22 07:20 
> /warehouse/tablespace/managed/hive/sys.db
> drwxrwx---+  - hive hadoop          0 2018-06-22 09:27 
> /warehouse/tablespace/managed/hive/tbl1002
> -bash-4.2$ hdfs dfs -ls /warehouse/tablespace/external/hive
> Found 2 items
> drwxr-xr-x+  - hive hadoop          0 2018-06-22 07:02 
> /warehouse/tablespace/external/hive/sys.db
> drwxrwxrwx+  - hive hadoop          0 2018-06-22 10:12 
> /warehouse/tablespace/external/hive/test_table
> -bash-4.2$ exit
> logout
> {code}
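> The '+' suffix in the listings above means extended HDFS ACLs are in place, so the plain permission bits may not tell the whole story. A hedged way to see exactly what hrt_qa is granted on the managed directory:
> {code}
> # Show any extended ACL entries on the managed warehouse directory
> hdfs dfs -getfacl /warehouse/tablespace/managed/hive
> {code}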
> It looks like the code still assumes external tables live under the
> '/warehouse/tablespace/managed' directory, the way everything used to live under
> the earlier '/apps/hive/warehouse', instead of resolving them to
> '/warehouse/tablespace/external'.
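> One way to confirm where the table actually resolves is to inspect its metadata location. A minimal check, reusing the same hrt_qa JDBC connection string as above (abbreviated here):
> {code}
> beeline -n hrt_qa -p pwd -u "<same JDBC URL as above>" \
>   --outputformat=tsv -e "describe formatted test_table;"
> # The Location field should point under /warehouse/tablespace/external/hive/test_table
> {code}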



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
