[ https://issues.apache.org/jira/browse/AMBARI-20553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15939556#comment-15939556 ]

Hadoop QA commented on AMBARI-20553:
------------------------------------

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  http://issues.apache.org/jira/secure/attachment/12860255/AMBARI-20553.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:red}-1 tests included{color}.  The patch doesn't appear to include any new or modified tests.
                        Please justify why no new tests are needed for this patch.
                        Also please list what manual steps were performed to verify this patch.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:green}+1 core tests{color}.  The patch passed unit tests in ambari-metrics/ambari-metrics-timelineservice ambari-server.

Test results: https://builds.apache.org/job/Ambari-trunk-test-patch/11148//testReport/
Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/11148//console

This message is automatically generated.

> Ambari script error for ams-hbase while writing to Amazon S3 on a cluster with no HDFS.
> ---------------------------------------------------------------------------------------
>
>                 Key: AMBARI-20553
>                 URL: https://issues.apache.org/jira/browse/AMBARI-20553
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-metrics
>    Affects Versions: 2.5.1
>            Reporter: Aravindan Vijayan
>            Assignee: Aravindan Vijayan
>            Priority: Blocker
>             Fix For: 2.5.1
>
>         Attachments: AMBARI-20553.patch
>
>
> Metrics collector startup scripts fail to handle S3 paths in configurations:
> {noformat}
>       {
>         "hbase-site":{
>           "properties":{
>             "hbase.rootdir":"s3a://ss-datasets/apps/hbase/",
>             "hbase.wal.dir":"file:///usr/lib/ams-hbase/data"
>           }
>         }
>       }
> {noformat}
> {noformat}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_collector.py", line 150, in <module>
>     AmsCollector().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 313, in execute
>     method(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 766, in restart
>     self.start(env)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_collector.py", line 48, in start
>     self.configure(env, action = 'start') # for security
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 116, in locking_configure
>     original_configure(obj, *args, **kw)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_collector.py", line 43, in configure
>     hbase('master', action)
>   File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
>     return fn(*args, **kwargs)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/hbase.py", line 213, in hbase
>     dfs_type=params.dfs_type
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 555, in action_create_on_execute
>     self.action_delayed("create")
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 552, in action_delayed
>     self.get_hdfs_resource_executor().action_delayed(action_name, self)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 279, in action_delayed
>     self._assert_valid()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 238, in _assert_valid
>     self.target_status = self._get_file_status(target)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 381, in _get_file_status
>     list_status = self.util.run_command(target, 'GETFILESTATUS', method='GET', ignore_status_codes=['404'], assertable_result=False)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 186, in run_command
>     _, out, err = get_user_call_output(cmd, user=self.run_user, logoutput=self.logoutput, quiet=False)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_user_call_output.py", line 61, in get_user_call_output
>     raise ExecutionFailed(err_msg, code, files_output[0], files_output[1])
> resource_management.core.exceptions.ExecutionFailed: Execution of 'curl -sS -L -w '%{http_code}' -X GET 'http://<host>:50070/webhdfs/v1s3a:/ss-datasets/apps/hbase?op=GETFILESTATUS&user.name=hdfs' 1>/tmp/tmpjkn3uB 2>/tmp/tmpCVb8Kl' returned 7. curl: (7) Failed to connect to <host> port 50070: Connection refused
> 000
> {noformat}
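For context on the failure: HdfsResource builds a WebHDFS request from whatever path it is handed, so a non-HDFS `s3a://` rootdir ends up glued onto the `/webhdfs/v1` prefix, producing the malformed URL visible in the curl command in the traceback. A minimal sketch of the kind of scheme guard that would avoid issuing WebHDFS calls for such paths (hypothetical helper name, written in Python 3 for brevity; the actual Ambari agent code runs on Python 2.6 and is not shown here):

```python
from urllib.parse import urlparse

def should_use_webhdfs(target):
    """Hypothetical guard: only scheme-less or hdfs:// paths belong to
    WebHDFS; s3a://, file://, and other schemes must be skipped."""
    scheme = urlparse(target).scheme
    return scheme in ("", "hdfs")

# The two paths from the hbase-site snippet in this issue:
rootdir = "s3a://ss-datasets/apps/hbase/"
wal_dir = "file:///usr/lib/ams-hbase/data"

print(should_use_webhdfs(rootdir))              # False: skip HdfsResource for S3
print(should_use_webhdfs(wal_dir))              # False: local filesystem
print(should_use_webhdfs("/apps/hbase/data"))   # True: plain HDFS path
```

With a check like this in place, the collector startup can create directories on S3 (or leave them to HBase itself) instead of asking a NameNode that does not exist on an HDFS-less cluster.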



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
