[ https://issues.apache.org/jira/browse/AMBARI-17276?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15335915#comment-15335915 ]

Hudson commented on AMBARI-17276:
---------------------------------

FAILURE: Integrated in Ambari-trunk-Commit #5102 (See 
[https://builds.apache.org/job/Ambari-trunk-Commit/5102/])
AMBARI-17276. Zeppelin: Intermittent failure while downloading example 
(dipayan.bhowmick: 
[http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=fc52d7158fc44651d8f00d64c351f2602cc6dbf2])
* 
ambari-server/src/main/resources/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/setup_snapshot.sh


> Zeppelin: Intermittent failure while downloading example notebooks
> ------------------------------------------------------------------
>
>                 Key: AMBARI-17276
>                 URL: https://issues.apache.org/jira/browse/AMBARI-17276
>             Project: Ambari
>          Issue Type: Bug
>    Affects Versions: ambari-2.4.0
>            Reporter: Renjith Kamath
>            Assignee: Renjith Kamath
>             Fix For: ambari-2.4.0
>
>         Attachments: AMBARI-17276_trunk+branch-2.4_v1.patch
>
>
> {code:title=stderr}
> Traceback (most recent call last):
>   File 
> "/var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/master.py",
>  line 203, in <module>
>     Master().execute()
>   File 
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
>  line 257, in execute
>     method(env)
>   File 
> "/var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/master.py",
>  line 58, in install
>     user=params.zeppelin_user)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", 
> line 155, in __init__
>     self.env.run()
>   File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 160, in run
>     self.run_action(resource, action)
>   File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 124, in run_action
>     provider_action()
>   File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 273, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", 
> line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", 
> line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", 
> line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", 
> line 293, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of 
> '/var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/setup_snapshot.sh
>  /usr/hdp/current/zeppelin-server c6401.ambari.apache.org 9083 10001 
> c6401.ambari.apache.org 9995 True 
> /var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package 
> /usr/jdk64/jdk1.8.0_66 >> /var/log/zeppelin/zeppelin-setup.log' returned 1. 
> --2016-06-16 11:32:11--  
> https://github.com/hortonworks-gallery/zeppelin-notebooks/archive/master.zip
> Resolving github.com... 192.30.252.129
> Connecting to github.com|192.30.252.129|:443... connected.
> HTTP request sent, awaiting response... 302 Found
> Location: 
> https://codeload.github.com/hortonworks-gallery/zeppelin-notebooks/zip/master 
> [following]
> --2016-06-16 11:32:13--  
> https://codeload.github.com/hortonworks-gallery/zeppelin-notebooks/zip/master
> Resolving codeload.github.com... 192.30.253.121
> Connecting to codeload.github.com|192.30.253.121|:443... connected.
> HTTP request sent, awaiting response... 200 OK
> Length: unspecified [application/zip]
> Saving to: “notebooks.zip”
>      0K .......... .......... .......... .......... .......... 83.0K
>     50K .......... .......... .......... .......... .......... 79.2K
>    100K .......... .......... .......... .......... ..........  164K
>    150K .......... .......... .......... .......... .......... 87.8K
>    200K .......... .......... .......... .......... ..........  176K
>    250K .......... .......... .......... .......... ..........  175K
>    300K .......... .......... .......... .......... ..........  167K
>    350K .......... .......... .......... .......... ..........  177K
>    400K .......... .......... .......... .......... ..........  174K
>    450K .......... .......... .......... .......... ..........  174K
>    500K .......... .......... .......... .......... ..........  182K
>    550K .......... .......... .......... .......... ..........  174K
>    600K .......... .......... .......... .......... ..........  176K
>    650K .......... .......... .......... .......... ..........  164K
>    700K .......... .......... .......... .......... ..........  174K
>    750K .......... .......... .......... .......... ..........  179K
>    800K .......... .......... .......... .....                  164K=5.8s
> 2016-06-16 11:32:28 (145 KB/s) - “notebooks.zip” saved [855111]
> warning [notebooks.zip]:  7 extra bytes at beginning or within zipfile
>   (attempting to process anyway)
> {code}
> {code:title=stdout}
> 2016-06-16 11:19:04,892 - Using hadoop conf dir: 
> /usr/hdp/current/hadoop-client/conf
> 2016-06-16 11:19:04,893 - Group['livy'] {}
> 2016-06-16 11:19:04,895 - Group['spark'] {}
> 2016-06-16 11:19:04,895 - Group['hadoop'] {}
> 2016-06-16 11:19:04,895 - Group['users'] {}
> 2016-06-16 11:19:04,895 - User['hive'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,896 - User['livy'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,896 - User['zookeeper'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,897 - User['spark'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,897 - User['ambari-qa'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['users']}
> 2016-06-16 11:19:04,897 - User['tez'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['users']}
> 2016-06-16 11:19:04,898 - User['hdfs'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,898 - User['yarn'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,899 - User['hcat'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,899 - User['mapred'] {'gid': 'hadoop', 
> 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
> 2016-06-16 11:19:04,899 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] 
> {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2016-06-16 11:19:04,901 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh 
> ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
>  {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2016-06-16 11:19:04,904 - Skipping 
> Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']
>  due to not_if
> 2016-06-16 11:19:04,904 - Group['hdfs'] {}
> 2016-06-16 11:19:04,905 - User['hdfs'] {'fetch_nonlocal_groups': True, 
> 'groups': ['hadoop', 'hdfs']}
> 2016-06-16 11:19:04,906 - FS Type: 
> 2016-06-16 11:19:04,906 - Directory['/etc/hadoop'] {'mode': 0755}
> 2016-06-16 11:19:04,918 - 
> File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': 
> InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
> 2016-06-16 11:19:04,918 - 
> Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 
> 'hdfs', 'group': 'hadoop', 'mode': 01777}
> 2016-06-16 11:19:04,934 - Initializing 2 repositories
> 2016-06-16 11:19:04,935 - Repository['HDP-2.5'] {'base_url': 
> 'http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos6/2.x/BUILDS/2.5.0.0-723',
>  'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 
> '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list 
> %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif 
> %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': 
> None}
> 2016-06-16 11:19:04,941 - File['/etc/yum.repos.d/HDP.repo'] {'content': 
> '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos6/2.x/BUILDS/2.5.0.0-723\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2016-06-16 11:19:04,941 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 
> 'http://s3.amazonaws.com/dev.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6',
>  'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': 
> '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list 
> %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif 
> %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 
> 'mirror_list': None}
> 2016-06-16 11:19:04,944 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': 
> '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://s3.amazonaws.com/dev.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2016-06-16 11:19:04,945 - Package['unzip'] {'retry_on_repo_unavailability': 
> False, 'retry_count': 5}
> 2016-06-16 11:19:04,988 - Skipping installation of existing package unzip
> 2016-06-16 11:19:04,988 - Package['curl'] {'retry_on_repo_unavailability': 
> False, 'retry_count': 5}
> 2016-06-16 11:19:04,994 - Skipping installation of existing package curl
> 2016-06-16 11:19:04,994 - Package['hdp-select'] 
> {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2016-06-16 11:19:05,001 - Skipping installation of existing package hdp-select
> 2016-06-16 11:19:05,124 - call['ambari-python-wrap /usr/bin/hdp-select status 
> spark-client'] {'timeout': 20}
> 2016-06-16 11:19:05,142 - call returned (0, 'spark-client - 2.5.0.0-723')
> 2016-06-16 11:19:05,144 - Using hadoop conf dir: 
> /usr/hdp/current/hadoop-client/conf
> 2016-06-16 11:19:05,145 - Execute['chmod a+x 
> /var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/setup_snapshot.sh']
>  {}
> 2016-06-16 11:19:05,154 - checked_call['rpm -q --queryformat 
> '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
> 2016-06-16 11:19:05,167 - checked_call returned (0, '2.5.0.0-723', '')
> 2016-06-16 11:19:05,168 - Package['zeppelin_2_5_0_0_723'] 
> {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2016-06-16 11:19:05,210 - Installing package zeppelin_2_5_0_0_723 
> ('/usr/bin/yum -d 0 -e 0 -y install zeppelin_2_5_0_0_723')
> 2016-06-16 11:32:11,743 - Directory['/var/run/zeppelin'] {'owner': 
> 'zeppelin', 'group': 'zeppelin', 'mode': 0755, 'cd_access': 'a'}
> 2016-06-16 11:32:11,744 - Directory['/usr/hdp/current/zeppelin-server'] 
> {'owner': 'zeppelin', 'group': 'zeppelin', 'mode': 0755, 'cd_access': 'a'}
> 2016-06-16 11:32:11,744 - Changing owner for /usr/hdp/current/zeppelin-server 
> from 0 to zeppelin
> 2016-06-16 11:32:11,744 - Changing group for /usr/hdp/current/zeppelin-server 
> from 0 to zeppelin
> 2016-06-16 11:32:11,744 - Execute['echo spark_version:1.6 detected for 
> spark_home: /usr/hdp/current/spark-client >> 
> /var/log/zeppelin/zeppelin-setup.log'] {'user': 'zeppelin'}
> 2016-06-16 11:32:11,763 - Directory['/var/log/zeppelin'] {'owner': 
> 'zeppelin', 'group': 'zeppelin', 'mode': 0755, 'cd_access': 'a'}
> 2016-06-16 11:32:11,764 - XmlConfig['zeppelin-site.xml'] {'owner': 
> 'zeppelin', 'group': 'zeppelin', 'conf_dir': 
> '/usr/hdp/current/zeppelin-server/conf', 'configurations': ...}
> 2016-06-16 11:32:11,772 - Generating config: 
> /usr/hdp/current/zeppelin-server/conf/zeppelin-site.xml
> 2016-06-16 11:32:11,772 - 
> File['/usr/hdp/current/zeppelin-server/conf/zeppelin-site.xml'] {'owner': 
> 'zeppelin', 'content': InlineTemplate(...), 'group': 'zeppelin', 'mode': 
> None, 'encoding': 'UTF-8'}
> 2016-06-16 11:32:11,783 - Writing 
> File['/usr/hdp/current/zeppelin-server/conf/zeppelin-site.xml'] because it 
> doesn't exist
> 2016-06-16 11:32:11,783 - Changing owner for 
> /usr/hdp/current/zeppelin-server/conf/zeppelin-site.xml from 0 to zeppelin
> 2016-06-16 11:32:11,783 - Changing group for 
> /usr/hdp/current/zeppelin-server/conf/zeppelin-site.xml from 0 to zeppelin
> 2016-06-16 11:32:11,785 - 
> File['/usr/hdp/current/zeppelin-server/conf/zeppelin-env.sh'] {'owner': 
> 'zeppelin', 'content': InlineTemplate(...), 'group': 'zeppelin'}
> 2016-06-16 11:32:11,785 - Writing 
> File['/usr/hdp/current/zeppelin-server/conf/zeppelin-env.sh'] because 
> contents don't match
> 2016-06-16 11:32:11,787 - 
> File['/usr/hdp/current/zeppelin-server/conf/shiro.ini'] {'owner': 'zeppelin', 
> 'content': InlineTemplate(...), 'group': 'zeppelin'}
> 2016-06-16 11:32:11,788 - Writing 
> File['/usr/hdp/current/zeppelin-server/conf/shiro.ini'] because contents 
> don't match
> 2016-06-16 11:32:11,788 - 
> File['/usr/hdp/current/zeppelin-server/conf/log4j.properties'] {'owner': 
> 'zeppelin', 'content': ..., 'group': 'zeppelin'}
> 2016-06-16 11:32:11,788 - Writing 
> File['/usr/hdp/current/zeppelin-server/conf/log4j.properties'] because 
> contents don't match
> 2016-06-16 11:32:11,789 - 
> Execute['/var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package/scripts/setup_snapshot.sh
>  /usr/hdp/current/zeppelin-server c6401.ambari.apache.org 9083 10001 
> c6401.ambari.apache.org 9995 True 
> /var/lib/ambari-agent/cache/common-services/ZEPPELIN/0.6.0.2.5/package 
> /usr/jdk64/jdk1.8.0_66 >> /var/log/zeppelin/zeppelin-setup.log'] {'user': 
> 'zeppelin'}
> Command failed after 1 tries
> {code}
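The stderr block above shows the failure mode: the notebook archive downloads from GitHub, but unzip then warns about "7 extra bytes at beginning or within zipfile", i.e. the archive arrived corrupt and the script exits non-zero. A minimal sketch of a retry-and-verify download step in that spirit follows. This is a hypothetical illustration, not the actual setup_snapshot.sh patch; the function name `fetch_and_verify` and its structure are invented here, and it assumes wget and unzip are on PATH as in the log.

```shell
# Hypothetical sketch only -- NOT the actual AMBARI-17276 patch.
# fetch_and_verify CMD TARGET: run CMD (which must write the archive to
# TARGET), then integrity-check TARGET with `unzip -t`; retry up to 3
# times, so a truncated or corrupt download is re-fetched instead of
# failing later during extraction.
fetch_and_verify() {
  cmd="$1"; target="$2"; tries=3
  while [ "$tries" -gt 0 ]; do
    tries=$((tries - 1))
    rm -f "$target"
    # $cmd is intentionally unquoted so the command string word-splits
    if $cmd && unzip -tqq "$target" >/dev/null 2>&1; then
      return 0   # archive present and passed the integrity check
    fi
  done
  return 1       # every attempt produced a missing or corrupt archive
}

# Example call, mirroring the URL from the stderr above:
# fetch_and_verify \
#   "wget -q -O notebooks.zip https://github.com/hortonworks-gallery/zeppelin-notebooks/archive/master.zip" \
#   notebooks.zip
```

Testing the archive with `unzip -t` before use, rather than trusting wget's exit code alone, is what distinguishes this from the plain download in the log: wget reported success (HTTP 200, file saved) even though the zip was damaged.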



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
