[
https://issues.apache.org/jira/browse/HIVE-22758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Chiran Ravani updated HIVE-22758:
---------------------------------
Description:
With doAs set to true, running create database with an external location fails
with permission denied, because write access to the directory is required for
the hive user (the user HMS runs as).
Steps to reproduce the issue:
1. Enable running Hive as the end user (doAs set to true).
2. Connect to Hive as a user other than admin, e.g. chiran.
3. Create a database with an external location:
{code}
create database externaldbexample location '/user/chiran/externaldbexample'
{code}
The statement fails because the hive service user does not have write access
to the location on HDFS:
{code}
> create database externaldbexample location '/user/chiran/externaldbexample';
INFO : Compiling
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d):
create database externaldbexample location '/user/chiran/externaldbexample'
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d); Time
taken: 1.377 seconds
INFO : Executing
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d):
create database externaldbexample location '/user/chiran/externaldbexample'
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:java.lang.reflect.UndeclaredThrowableException)
INFO : Completed executing
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d); Time
taken: 0.238 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1
from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:java.lang.reflect.UndeclaredThrowableException)
(state=08S01,code=1)
{code}
The Hive Metastore service log shows the following:
{code}
2020-01-22T04:36:27,870 WARN [pool-6-thread-6]: metastore.ObjectStore
(ObjectStore.java:getDatabase(1010)) - Failed to get database
hive.externaldbexample, returning NoSuchObjectException
2020-01-22T04:36:27,898 INFO [pool-6-thread-6]: metastore.HiveMetaStore
(HiveMetaStore.java:run(1339)) - Creating database path in managed directory
hdfs://c470-node2.squadron.support.hortonworks.com:8020/user/chiran/externaldbexample
2020-01-22T04:36:27,903 INFO [pool-6-thread-6]: utils.FileUtils
(FileUtils.java:mkdir(170)) - Creating directory if it doesn't exist:
hdfs://namenodeaddress:8020/user/chiran/externaldbexample
2020-01-22T04:36:27,932 ERROR [pool-6-thread-6]: utils.MetaStoreUtils
(MetaStoreUtils.java:logAndThrowMetaException(169)) - Got exception:
org.apache.hadoop.security.AccessControlException Permission denied: user=hive,
access=WRITE, inode="/user/chiran":chiran:chiran:drwxr-xr-x
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1859)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1843)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1802)
at
org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
at
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
{code}
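That is, the metastore issues the mkdir with its own (hive) credentials. For
illustration only, a minimal Java sketch of the failing call, with the path and
ownership taken from the log above (the class and variable names are hypothetical):
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MkdirAsServiceUser {
  public static void main(String[] args) throws Exception {
    // HMS runs as the hive service user, so creating this directory needs
    // WRITE access on the parent inode /user/chiran, which is owned by
    // chiran:chiran with mode drwxr-xr-x -- hence the AccessControlException.
    Configuration conf = new Configuration();
    Path dbLocation = new Path("/user/chiran/externaldbexample");
    FileSystem fs = dbLocation.getFileSystem(conf);
    fs.mkdirs(dbLocation); // fails: Permission denied: user=hive, access=WRITE
  }
}
{code}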
This behavior looks like a regression from HIVE-20001.
Can we add a check for whether the external path was explicitly specified by the
user, and if so create the database directory as the end user instead of the hive
service user?
Attaching a patch, which was tested locally on Hive 3.1.
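For reference, a minimal sketch of the proposed approach (not the attached patch),
assuming the metastore already knows the requesting end-user name and the resolved
database location; {{endUser}} and {{dbLocation}} are hypothetical names:
{code}
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class CreateDbDirAsEndUser {
  // Create the database directory as the requesting end user (proxy user)
  // instead of the hive service user, so the HDFS permission check is made
  // against the end user's own directory (e.g. /user/chiran).
  static void createExternalDbDir(Configuration conf, String endUser, Path dbLocation)
      throws Exception {
    UserGroupInformation proxyUser =
        UserGroupInformation.createProxyUser(endUser, UserGroupInformation.getLoginUser());
    proxyUser.doAs((PrivilegedExceptionAction<Void>) () -> {
      FileSystem fs = dbLocation.getFileSystem(conf);
      if (!fs.exists(dbLocation)) {
        fs.mkdirs(dbLocation); // runs as e.g. chiran, not hive
      }
      return null;
    });
  }
}
{code}
This relies on the HDFS proxy-user (impersonation) settings that a doAs deployment
already requires for the hive service user.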
was:
With doAs set to true, running create database with an external location fails
because the hive user (the user HMS runs as) is denied write permission on the
specified directory.
Steps to reproduce the issue:
1. Enable running Hive as the end user (doAs set to true).
2. Connect to Hive as a user other than admin, e.g. chiran.
3. Create a database with an external location:
{code}
create database externaldbexample location '/user/chiran/externaldbexample'
{code}
The above statement fails with an HDFS write permission denied error, as shown below.
{code}
> create database externaldbexample location '/user/chiran/externaldbexample';
INFO : Compiling
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d):
create database externaldbexample location '/user/chiran/externaldbexample'
INFO : Semantic Analysis Completed (retrial = false)
INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
INFO : Completed compiling
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d); Time
taken: 1.377 seconds
INFO : Executing
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d):
create database externaldbexample location '/user/chiran/externaldbexample'
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:java.lang.reflect.UndeclaredThrowableException)
INFO : Completed executing
command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d); Time
taken: 0.238 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1
from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:java.lang.reflect.UndeclaredThrowableException)
(state=08S01,code=1)
{code}
The Hive Metastore service log shows the following:
{code}
2020-01-22T04:36:27,870 WARN [pool-6-thread-6]: metastore.ObjectStore
(ObjectStore.java:getDatabase(1010)) - Failed to get database
hive.externaldbexample, returning NoSuchObjectException
2020-01-22T04:36:27,898 INFO [pool-6-thread-6]: metastore.HiveMetaStore
(HiveMetaStore.java:run(1339)) - Creating database path in managed directory
hdfs://c470-node2.squadron.support.hortonworks.com:8020/user/chiran/externaldbexample
2020-01-22T04:36:27,903 INFO [pool-6-thread-6]: utils.FileUtils
(FileUtils.java:mkdir(170)) - Creating directory if it doesn't exist:
hdfs://namenodeaddress:8020/user/chiran/externaldbexample
2020-01-22T04:36:27,932 ERROR [pool-6-thread-6]: utils.MetaStoreUtils
(MetaStoreUtils.java:logAndThrowMetaException(169)) - Got exception:
org.apache.hadoop.security.AccessControlException Permission denied: user=hive,
access=WRITE, inode="/user/chiran":chiran:chiran:drwxr-xr-x
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
at
org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1859)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1843)
at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1802)
at
org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
at
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
{code}
This behavior looks like a regression from HIVE-20001.
Can we add a check for whether the external path was explicitly specified by the
user, and if so create the database directory as the end user instead of the hive
service user?
Attaching a patch, which was tested locally on Hive 3.1.
> Create database with permission error when doas set to true
> -----------------------------------------------------------
>
> Key: HIVE-22758
> URL: https://issues.apache.org/jira/browse/HIVE-22758
> Project: Hive
> Issue Type: Bug
> Components: Standalone Metastore
> Affects Versions: 3.0.0, 3.1.0
> Reporter: Chiran Ravani
> Assignee: Chiran Ravani
> Priority: Critical
> Attachments: HIVE-22758.1.patch
>
>
> With doAs set to true, running create database with an external location fails
> with permission denied, because write access to the directory is required for
> the hive user (the user HMS runs as).
> Steps to reproduce the issue:
> 1. Enable running Hive as the end user (doAs set to true).
> 2. Connect to Hive as a user other than admin, e.g. chiran.
> 3. Create a database with an external location:
> {code}
> create database externaldbexample location '/user/chiran/externaldbexample'
> {code}
> The statement fails because the hive service user does not have write access
> to the location on HDFS:
> {code}
> > create database externaldbexample location '/user/chiran/externaldbexample';
> INFO : Compiling
> command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d):
> create database externaldbexample location '/user/chiran/externaldbexample'
> INFO : Semantic Analysis Completed (retrial = false)
> INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO : Completed compiling
> command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d);
> Time taken: 1.377 seconds
> INFO : Executing
> command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d):
> create database externaldbexample location '/user/chiran/externaldbexample'
> INFO : Starting task [Stage-0:DDL] in serial mode
> ERROR : FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:java.lang.reflect.UndeclaredThrowableException)
> INFO : Completed executing
> command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d);
> Time taken: 0.238 seconds
> Error: Error while processing statement: FAILED: Execution Error, return code
> 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:java.lang.reflect.UndeclaredThrowableException)
> (state=08S01,code=1)
> {code}
> The Hive Metastore service log shows the following:
> {code}
> 2020-01-22T04:36:27,870 WARN [pool-6-thread-6]: metastore.ObjectStore
> (ObjectStore.java:getDatabase(1010)) - Failed to get database
> hive.externaldbexample, returning NoSuchObjectException
> 2020-01-22T04:36:27,898 INFO [pool-6-thread-6]: metastore.HiveMetaStore
> (HiveMetaStore.java:run(1339)) - Creating database path in managed directory
> hdfs://c470-node2.squadron.support.hortonworks.com:8020/user/chiran/externaldbexample
> 2020-01-22T04:36:27,903 INFO [pool-6-thread-6]: utils.FileUtils
> (FileUtils.java:mkdir(170)) - Creating directory if it doesn't exist:
> hdfs://namenodeaddress:8020/user/chiran/externaldbexample
> 2020-01-22T04:36:27,932 ERROR [pool-6-thread-6]: utils.MetaStoreUtils
> (MetaStoreUtils.java:logAndThrowMetaException(169)) - Got exception:
> org.apache.hadoop.security.AccessControlException Permission denied:
> user=hive, access=WRITE, inode="/user/chiran":chiran:chiran:drwxr-xr-x
> at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
> at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
> at
> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
> at
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1859)
> at
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1843)
> at
> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1802)
> at
> org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
> at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
> at
> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
> at
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
> at
> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
> at
> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
> at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
> at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
> {code}
> This behavior looks like a regression from HIVE-20001.
> Can we add a check for whether the external path was explicitly specified by the
> user, and if so create the database directory as the end user instead of the hive
> service user?
> Attaching a patch, which was tested locally on Hive 3.1.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)