-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/51901/
-----------------------------------------------------------

(Updated Sept. 14, 2016, 10:23 p.m.)


Review request for Ambari, Alejandro Fernandez and Sumit Mohanty.


Changes
-------

Updated the Testing section.


Bugs: AMBARI-18393
    https://issues.apache.org/jira/browse/AMBARI-18393


Repository: ambari


Description
-------

**Issue:**

- Install a cluster using a Blueprint that includes HiveServerInteractive.
- Starting services fails at the Hive Server Interactive start step with the error below:

======================================================================================================
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 535, in <module>
    HiveServerInteractive().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 115, in start
    self.setup_security()
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server_interactive.py", line 335, in setup_security
    Execute(slider_keytab_install_cmd, user=params.hive_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'slider install-keytab --keytab /etc/security/keytabs/hive.llap.zk.sm.keytab --folder hive --overwrite' returned 56.
2016-08-15 23:47:57,518 [main] INFO  tools.SliderUtils - JVM initialized into secure mode with kerberos realm HWQE.HORTONWORKS.COM
2016-08-15 23:47:59,108 [main] INFO  impl.TimelineClientImpl - Timeline service address: http://nat-s11-4-lkws-stackdeploy-3.openstacklocal:8188/ws/v1/timeline/
2016-08-15 23:48:01,584 [main] WARN  shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2016-08-15 23:48:01,633 [main] INFO  client.RMProxy - Connecting to ResourceManager at nat-s11-4-lkws-stackdeploy-5.openstacklocal/172.22.71.181:8050
2016-08-15 23:48:01,983 [main] INFO  client.AHSProxy - Connecting to Application History server at nat-s11-4-lkws-stackdeploy-3.openstacklocal/172.22.71.168:10200
2016-08-15 23:48:03,297 [main] WARN  client.SliderClient - The 'install-keytab' option has been deprecated.  Please use 'keytab --install'.
2016-08-15 23:48:03,440 [main] WARN  retry.RetryInvocationHandler - Exception while invoking ClientNamenodeProtocolTranslatorPB.mkdirs over null. Not retrying because try once and fail.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=hive, access=WRITE, inode="/user/hive/.slider/keytabs/hive":hdfs:hdfs:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1811)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1794)
        at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4011)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1102)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:630)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)


======================================================================================================


**Fix:**

- Added creation of the HDFS '/user/<hive-user>' directory in HSI as well (see the sketch below).
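
The change is essentially the same pattern the non-interactive Hive scripts already use: declare the Hive user's HDFS home directory as an HdfsResource before HSI's setup_security() runs 'slider install-keytab'. A minimal sketch only, assuming the usual params.py conventions (names such as hive_hdfs_user_dir and hive_hdfs_user_mode are assumptions here, not copied from the diff):

import params   # the stack's params.py, as in the other HIVE package scripts

def create_hive_user_dir():
  # Ensure '/user/<hive-user>' exists and is owned by the hive user, so that
  # 'slider install-keytab' can write under /user/<hive-user>/.slider/keytabs/.
  params.HdfsResource(params.hive_hdfs_user_dir,          # e.g. /user/hive
                      type = "directory",
                      action = "create_on_execute",
                      owner = params.hive_user,
                      mode = params.hive_hdfs_user_mode)  # e.g. 0755
  # Flush the queued HdfsResource operations to HDFS.
  params.HdfsResource(None, action = "execute")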


Diffs
-----

  ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/hive_interactive.py 74c67fc 
  ambari-server/src/test/python/stacks/2.5/HIVE/test_hive_server_int.py b97c377 

Diff: https://reviews.apache.org/r/51901/diff/


Testing (updated)
-------

Manual Testing (two ways):

- Skipped the INSTALL and START of Hive Server, Hive Metastore, and WebHCat Server, then installed HSI. HSI created the HDFS '/user/<hive-user>' dir, and its INSTALL and START passed.

- Did the INSTALL and START of Hive Server, Hive Metastore, and WebHCat Server, then deleted the HDFS '/user/hive' dir, and then installed HSI. HSI re-created the HDFS '/user/<hive-user>' dir, and its INSTALL and START passed.

Python UT : Passes
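
For context, the updated test_hive_server_int.py would assert the new HdfsResource calls in the stack tests' usual RMFTestCase style; the snippet below is only an illustrative sketch (the literal path, owner, and mode values are assumptions, not the actual test code):

from stacks.utils.RMFTestCase import RMFTestCase

class TestHiveServerInteractiveSketch(RMFTestCase):
  def check_hive_user_dir_assertions(self):
    # Expect the HSI install/start sequence to declare '/user/<hive-user>' ...
    self.assertResourceCalled('HdfsResource', '/user/hive',
        type = 'directory',
        action = ['create_on_execute'],
        owner = 'hive',
        mode = 0755,
    )
    # ... and then execute the queued HDFS operations.
    self.assertResourceCalled('HdfsResource', None,
        action = ['execute'],
    )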

Jenkins : pending


Thanks,

Swapan Shridhar
