[ 
https://issues.apache.org/jira/browse/AMBARI-16234?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15269833#comment-15269833
 ] 

Hadoop QA commented on AMBARI-16234:
------------------------------------

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  http://issues.apache.org/jira/secure/attachment/12802036/AMBARI-16234.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:red}-1 tests included{color}.  The patch doesn't appear to include any new or modified tests.
                        Please justify why no new tests are needed for this patch.
                        Also please list what manual steps were performed to verify this patch.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:green}+1 core tests{color}.  The patch passed unit tests in ambari-server.

Test results: https://builds.apache.org/job/Ambari-trunk-test-patch/6759//testReport/
Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/6759//console

This message is automatically generated.

> WebHCat service check fails since pig tarball copied to incorrect location
> --------------------------------------------------------------------------
>
>                 Key: AMBARI-16234
>                 URL: https://issues.apache.org/jira/browse/AMBARI-16234
>             Project: Ambari
>          Issue Type: Bug
>          Components: stacks
>    Affects Versions: 2.4.0
>            Reporter: Jayush Luniya
>            Assignee: Jayush Luniya
>             Fix For: 2.4.0
>
>         Attachments: AMBARI-16234.patch
>
>
> Service Check for WebHCat is failing.
> ambari-server-2.4.0.0-4910.x86_64
> ambari-server --hash
> acfa1c0e7a9b8513eb74747008a43d70728e07bb
> The pig tarball was copied to the wrong location:
> {code}
> # Expected location
> [root@c6404 ~]# su hdfs -c "hdfs dfs -ls hdfs:///hdp/apps/2.5.0.0-267"
> Found 3 items
> dr-xr-xr-x   - hdfs hdfs          0 2016-04-25 23:20 hdfs:///hdp/apps/2.5.0.0-267/mapreduce
> dr-xr-xr-x   - hdfs hdfs          0 2016-04-26 20:32 hdfs:///hdp/apps/2.5.0.0-267/slider
> dr-xr-xr-x   - hdfs hdfs          0 2016-04-26 20:32 hdfs:///hdp/apps/2.5.0.0-267/tez
> # Incorrect location
> [root@c6404 ~]# su hdfs -c "hdfs dfs -ls hdfs:///HDP/apps/2.5.0.0-267/pig/pig.tar.gz"
> -r--r--r--   3 hdfs hadoop   98902307 2016-04-26 20:33 hdfs:///HDP/apps/2.5.0.0-267/pig/pig.tar.gz
> # It does exist on the host with Hive Server
> [root@c6404 ~]# ls -la /usr/hdp/current/pig-client/pig.tar.gz
> -rw-r--r-- 1 root root 98902307 Apr 24 14:00 /usr/hdp/current/pig-client/pig.tar.gz
> {code}
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 167, in <module>
>     HiveServiceCheck().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 248, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py", line 95, in service_check
>     webhcat_service_check()
>   File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
>     return fn(*args, **kwargs)
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service_check.py", line 125, in webhcat_service_check
>     logoutput=True)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh c6404.ambari.apache.org ambari-qa 50111 idtest.ambari-qa.1461702877.77.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {"error":"File hdfs:///hdp/apps/2.5.0.0-267/pig/pig.tar.gz does not exist."}http_code <500>
> {code}
> This is because the code in params_linux.py of Hive should cast stack_name to lowercase when building the HDFS destination paths:
> {code}
> hive_tar_source = "{0}/{1}/hive/hive.tar.gz".format(stack_root, STACK_VERSION_PATTERN)
> pig_tar_source = "{0}/{1}/pig/pig.tar.gz".format(stack_root, STACK_VERSION_PATTERN)
> hive_tar_dest_file = "/{0}/apps/{1}/hive/hive.tar.gz".format(stack_name, STACK_VERSION_PATTERN)
> pig_tar_dest_file = "/{0}/apps/{1}/pig/pig.tar.gz".format(stack_name, STACK_VERSION_PATTERN)
> hadoop_streaming_tar_source = "{0}/{1}/hadoop-mapreduce/hadoop-streaming.jar".format(stack_root, STACK_VERSION_PATTERN)
> sqoop_tar_source = "{0}/{1}/sqoop/sqoop.tar.gz".format(stack_root, STACK_VERSION_PATTERN)
> hadoop_streaming_tar_dest_dir = "/{0}/apps/{1}/mapreduce/".format(stack_name, STACK_VERSION_PATTERN)
> sqoop_tar_dest_dir = "/{0}/apps/{1}/sqoop/".format(stack_name, STACK_VERSION_PATTERN)
> {code}
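A minimal sketch of that kind of fix, for illustration only (the actual change is in AMBARI-16234.patch; `pig_tar_dest` is a hypothetical helper name, not Ambari code): lowercase stack_name before formatting the destination, so a stack named "HDP" maps to hdfs:///hdp/apps/... as the service check expects.

```python
def pig_tar_dest(stack_name, stack_version):
    # Lowercase the stack name so the destination matches the expected
    # hdfs:///hdp/apps/... layout rather than the incorrect /HDP/apps/...
    return "/{0}/apps/{1}/pig/pig.tar.gz".format(stack_name.lower(), stack_version)
```

For example, `pig_tar_dest("HDP", "2.5.0.0-267")` yields "/hdp/apps/2.5.0.0-267/pig/pig.tar.gz", the location the WebHCat service check reads from.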



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
