[ https://issues.apache.org/jira/browse/AMBARI-22435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16251716#comment-16251716 ]

Hadoop QA commented on AMBARI-22435:
------------------------------------

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  http://issues.apache.org/jira/secure/attachment/12897443/AMBARI-22435-trunk.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:red}-1 tests included{color}.  The patch doesn't appear to include any new or modified tests.
                        Please justify why no new tests are needed for this patch.
                        Also please list what manual steps were performed to verify this patch.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:red}-1 core tests{color}.  The test build failed in [ambari-server|https://builds.apache.org/job/Ambari-trunk-test-patch/12667//artifact/patch-work/testrun_ambari-server.txt]

Console output: 
https://builds.apache.org/job/Ambari-trunk-test-patch/12667//console

This message is automatically generated.

> Hive services sets incorrect permissions on the root of HDFS while setting up replication
> ------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-22435
>                 URL: https://issues.apache.org/jira/browse/AMBARI-22435
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.6.0
>            Reporter: Deepesh Khandelwal
>            Priority: Critical
>             Fix For: trunk
>
>         Attachments: AMBARI-22435-trunk.patch
>
>
> While starting Hive Metastore or HiveServer2, the following entries were observed in the HDFS audit log:
> {noformat}
> 2017-11-14 01:48:03,682 INFO FSNamesystem.audit: allowed=true   ugi=hdfs (auth:SIMPLE)  ip=/172.27.70.128   cmd=getfileinfo   src=/   dst=null   perm=null   proto=webhdfs
> 2017-11-14 01:48:03,745 INFO FSNamesystem.audit: allowed=true   ugi=hdfs (auth:SIMPLE)  ip=/172.27.70.128   cmd=setPermission   src=/   dst=null   perm=hdfs:hdfs:rwxrwxrwt   proto=webhdfs
> 2017-11-14 01:48:03,809 INFO FSNamesystem.audit: allowed=true   ugi=hdfs (auth:SIMPLE)  ip=/172.27.70.128   cmd=setOwner   src=/   dst=null   perm=hive:hadoop:rwxrwxrwt   proto=webhdfs
> 2017-11-14 01:48:04,126 INFO FSNamesystem.audit: allowed=true   ugi=hdfs (auth:SIMPLE)  ip=/172.27.70.128   cmd=getfileinfo   src=/   dst=null   perm=null   proto=webhdfs
> 2017-11-14 01:48:04,189 INFO FSNamesystem.audit: allowed=true   ugi=hdfs (auth:SIMPLE)  ip=/172.27.70.128   cmd=setPermission   src=/   dst=null   perm=hive:hadoop:rwx------   proto=webhdfs
> {noformat}
> The consequence was that any non-hdfs user would see the following error when running hadoop commands:
> {noformat}
> $ hadoop fs -ls /
> ls: Permission denied: user=root, access=READ_EXECUTE, inode="/":hive:hadoop:drwx------
> {noformat}
> Debugging showed that this was caused by an incomplete null/empty check on the params
> hive_repl_cmrootdir and hive_repl_rootdir in
> ambari/blob/trunk/ambari-server/src/main/resources/common-services/HIVE/2.1.0.3.0/package/scripts/hive.py:
> when either param resolves to an empty string, the setPermission/setOwner calls end up targeting the
> HDFS root "/" instead of the intended replication directories; see the sketch below.
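>
> The sketch below is a hedged illustration of that guard, not the actual AMBARI-22435 patch. The helper
> and function names (_is_configured, setup_hive_repl_dirs) are assumptions; only the two parameter names
> come from the description above.
> {code:python}
> # Illustrative sketch only: the guard around the replication-directory setup
> # must treat an empty string the same as None, otherwise the setup runs
> # against "" and the resulting chmod/chown lands on the HDFS root "/".
>
> def _is_configured(value):
>     # None, "" and whitespace-only values all mean "not configured".
>     return value is not None and str(value).strip() != ""
>
> def setup_hive_repl_dirs(params):
>     # Skip entirely when either replication dir is unset; this is what
>     # prevents the setOwner/setPermission on src=/ seen in the audit log.
>     if not (_is_configured(params.hive_repl_cmrootdir) and
>             _is_configured(params.hive_repl_rootdir)):
>         return
>     # ... create the two directories with the intended owner and mode here,
>     # e.g. via params.HdfsResource(...) in the real script ...
> {code}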



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
