IMHO you may want to refer to Hortonworks' instruction manual for
installation / upgrade to see if things are fine... might help
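
Also, since yum reports "No more mirrors to try" even though you can see the
RPM in a browser, a quick sanity check from one of the failing nodes might
help. This is only a minimal sketch: the base URL and RPM path are copied from
your HDP.repo and the yum error below, so adjust them if yours differ:

```shell
#!/bin/sh
# Rebuild the exact URL yum tried, then probe it with curl from the node.
# BASEURL comes from HDP.repo; RPM is the path named in the yum error.
BASEURL="http://l1032lab.sss.se.scania.com/hdp/HDP/centos6/2.x/GA/2.2.0.0"
RPM="hadoop/hadoop_2_2_0_0_2041-yarn-2.6.0.2.2.0.0-2041.el6.x86_64.rpm"

rpm_url() {
  # yum joins the repo baseurl and the package's relative location with a slash
  printf '%s/%s\n' "$BASEURL" "$RPM"
}

probe() {
  # HEAD request only; -f fails on HTTP errors, -x "" bypasses any proxy
  # (the same flags Ambari passes to curl in the log below)
  curl -sfI -x "" "$(rpm_url)" >/dev/null && echo reachable || echo unreachable
}
```

If `probe` prints "unreachable" on a node while your browser shows the file,
a proxy or per-node DNS/firewall difference is the likely suspect.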

regards
Devopam

On Wed, Feb 18, 2015 at 9:21 PM, Joshi Omkar <[email protected]> wrote:

>  Using Ambari 1.7, I’m trying to install the HDP 2.2 stack on 9 nodes.
>
>
>
> The HDP.repo is:
>
> [HDP-2.2]
> name=HDP
> baseurl=http://l1032lab.sss.se.scania.com/hdp/HDP/centos6/2.x/GA/2.2.0.0/
> path=/
> enabled=1
> gpgcheck=0
>
>
>
>
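
One thing worth ruling out here: if these nodes sit behind a corporate proxy
(the scania.com hostname suggests they might), yum can end up routing requests
for this internal host through the proxy even when a browser goes direct, which
produces exactly this "[Errno 256] No more mirrors to try" symptom. This is
only a guess, and the extra line is harmless if no proxy is configured:

```ini
[HDP-2.2]
name=HDP
baseurl=http://l1032lab.sss.se.scania.com/hdp/HDP/centos6/2.x/GA/2.2.0.0/
path=/
enabled=1
gpgcheck=0
proxy=_none_
```

`proxy=_none_` tells yum to bypass any proxy set in /etc/yum.conf or the
environment for this one repo.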
>
> The ambari.repo file is:
>
> [ambari-1.x]
> name=Ambari 1.x
> baseurl=http://public-repo-1.hortonworks.com/ambari/centos6/1.x/GA
> gpgcheck=1
> gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
> enabled=0
> priority=1
>
> [Updates-ambari-1.7.0]
> name=ambari-1.7.0 - Updates
> baseurl=http://l1032lab.sss.se.scania.com/ambari/centos6/1.x/updates/1.7.0/
> gpgcheck=0
> gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
> enabled=1
> priority=1
>
>
>
> I get several warnings and one error for the App Timeline Server, whose log
> is given below. The RPM
> (hadoop/hadoop_2_2_0_0_2041-yarn-2.6.0.2.2.0.0-2041.el6.x86_64.rpm) exists,
> and I’m able to see it even in the browser:
>
>
>
> stderr:
>
> 2015-02-18 16:28:28,134 - Error while executing command 'install':
>
> Traceback (most recent call last):
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
> line 123, in execute
>
>     method(env)
>
>   File
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/application_timeline_server.py",
> line 30, in install
>
>     self.install_packages(env)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
> line 188, in install_packages
>
>     Package(name)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line
> 148, in __init__
>
>     self.env.run()
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
> line 149, in run
>
>     self.run_action(resource, action)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
> line 115, in run_action
>
>     provider_action()
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py",
> line 40, in action_install
>
>     self.install_package(package_name)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py",
> line 36, in install_package
>
>     shell.checked_call(cmd)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
> 36, in checked_call
>
>     return _call(command, logoutput, True, cwd, env, preexec_fn, user,
> wait_for_finish, timeout, path)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
> 102, in _call
>
>     raise Fail(err_msg)
>
> Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_2_*-yarn'
> returned 1. Error Downloading Packages:
>
>   hadoop_2_2_0_0_2041-yarn-2.6.0.2.2.0.0-2041.el6.x86_64: failure:
> hadoop/hadoop_2_2_0_0_2041-yarn-2.6.0.2.2.0.0-2041.el6.x86_64.rpm from
> HDP-2.2: [Errno 256] No more mirrors to try.
>
> stdout:
>
> 2015-02-18 16:27:46,888 - Execute['mkdir -p
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x ""
> --retry 10
> http://l1032lab.sss.se.scania.com:8080/resources//UnlimitedJCEPolicyJDK7.zip
> -o
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
> {'environment': ..., 'not_if': 'test -e
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip',
> 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
>
> 2015-02-18 16:27:46,928 - Skipping Execute['mkdir -p
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x ""
> --retry 10
> http://l1032lab.sss.se.scania.com:8080/resources//UnlimitedJCEPolicyJDK7.zip
> -o
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
> due to not_if
>
> 2015-02-18 16:27:46,929 - Group['hadoop'] {'ignore_failures': True}
>
> 2015-02-18 16:27:46,933 - Modifying group hadoop
>
> 2015-02-18 16:27:46,985 - Skipping failure of Group['hadoop'] due to
> ignore_failures. Failure reason: Execution of 'groupmod hadoop' returned
> 10. groupmod: group 'hadoop' does not exist in /etc/group
>
> 2015-02-18 16:27:46,985 - Group['nobody'] {'ignore_failures': True}
>
> 2015-02-18 16:27:46,985 - Modifying group nobody
>
> 2015-02-18 16:27:47,035 - Group['users'] {'ignore_failures': True}
>
> 2015-02-18 16:27:47,035 - Modifying group users
>
> 2015-02-18 16:27:47,091 - Group['nagios'] {'ignore_failures': True}
>
> 2015-02-18 16:27:47,091 - Modifying group nagios
>
> 2015-02-18 16:27:47,141 - User['hive'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,141 - Modifying user hive
>
> 2015-02-18 16:27:47,188 - User['nobody'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'nobody']}
>
> 2015-02-18 16:27:47,188 - Modifying user nobody
>
> 2015-02-18 16:27:47,235 - User['nagios'] {'gid': 'nagios',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,235 - Modifying user nagios
>
> 2015-02-18 16:27:47,283 - User['ambari-qa'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'users']}
>
> 2015-02-18 16:27:47,284 - Modifying user ambari-qa
>
> 2015-02-18 16:27:47,332 - User['flume'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,333 - Modifying user flume
>
> 2015-02-18 16:27:47,378 - User['hdfs'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,379 - Modifying user hdfs
>
> 2015-02-18 16:27:47,425 - User['mapred'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,425 - Modifying user mapred
>
> 2015-02-18 16:27:47,473 - User['hbase'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,474 - Modifying user hbase
>
> 2015-02-18 16:27:47,522 - User['tez'] {'gid': 'hadoop', 'ignore_failures':
> True, 'groups': [u'users']}
>
> 2015-02-18 16:27:47,523 - Modifying user tez
>
> 2015-02-18 16:27:47,567 - User['zookeeper'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,567 - Modifying user zookeeper
>
> 2015-02-18 16:27:47,614 - User['sqoop'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,615 - Modifying user sqoop
>
> 2015-02-18 16:27:47,663 - User['yarn'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,664 - Modifying user yarn
>
> 2015-02-18 16:27:47,712 - User['hcat'] {'gid': 'hadoop',
> 'ignore_failures': True, 'groups': [u'hadoop']}
>
> 2015-02-18 16:27:47,713 - Modifying user hcat
>
> 2015-02-18 16:27:47,758 -
> File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content':
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>
> 2015-02-18 16:27:47,760 -
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
> 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>
> 2015-02-18 16:27:47,803 - Skipping
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
> 2>/dev/null'] due to not_if
>
> 2015-02-18 16:27:47,804 -
> File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content':
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>
> 2015-02-18 16:27:47,805 -
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/nsr/hadoop/hbase
> 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
>
> 2015-02-18 16:27:47,850 - Skipping
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/nsr/hadoop/hbase
> 2>/dev/null'] due to not_if
>
> 2015-02-18 16:27:47,850 - Directory['/etc/hadoop/conf.empty'] {'owner':
> 'root', 'group': 'root', 'recursive': True}
>
> 2015-02-18 16:27:47,851 - Link['/etc/hadoop/conf'] {'not_if': 'ls
> /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
>
> 2015-02-18 16:27:47,892 - Skipping Link['/etc/hadoop/conf'] due to not_if
>
> 2015-02-18 16:27:47,915 - File['/etc/hadoop/conf/hadoop-env.sh']
> {'content': InlineTemplate(...), 'owner': 'hdfs'}
>
> 2015-02-18 16:27:47,929 - Repository['HDP-2.2'] {'base_url': '
> http://l1032lab.sss.se.scania.com/hdp/HDP/centos6/2.x/GA/2.2.0.0/',
> 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template':
> 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
>
> 2015-02-18 16:27:47,941 - File['/etc/yum.repos.d/HDP.repo'] {'content':
> Template('repo_suse_rhel.j2')}
>
> 2015-02-18 16:27:47,942 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': '
> http://l1032lab.sss.se.scania.com/hdp/HDP-UTILS-1.1.0.20/repos/centos6/',
> 'action': ['create'], 'components': [u'HDP-UTILS', 'main'],
> 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS',
> 'mirror_list': None}
>
> 2015-02-18 16:27:47,946 - File['/etc/yum.repos.d/HDP-UTILS.repo']
> {'content': Template('repo_suse_rhel.j2')}
>
> 2015-02-18 16:27:47,946 - Package['unzip'] {}
>
> 2015-02-18 16:27:48,485 - Skipping installing existent package unzip
>
> 2015-02-18 16:27:48,485 - Package['curl'] {}
>
> 2015-02-18 16:27:49,019 - Skipping installing existent package curl
>
> 2015-02-18 16:27:49,020 - Package['hdp-select'] {}
>
> 2015-02-18 16:27:49,553 - Skipping installing existent package hdp-select
>
> 2015-02-18 16:27:49,554 - Execute['mkdir -p
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""
> --retry 10
> http://l1032lab.sss.se.scania.com:8080/resources//jdk-7u67-linux-x64.tar.gz
> -o
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz']
> {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java',
> 'path': ['/bin', '/usr/bin/']}
>
> 2015-02-18 16:27:49,588 - Skipping Execute['mkdir -p
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""
> --retry 10
> http://l1032lab.sss.se.scania.com:8080/resources//jdk-7u67-linux-x64.tar.gz
> -o
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz']
> due to not_if
>
> 2015-02-18 16:27:49,589 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ;
> tar -xf
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz
> > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java',
> 'path': ['/bin', '/usr/bin/']}
>
> 2015-02-18 16:27:49,621 - Skipping Execute['mkdir -p /usr/jdk64 ; cd
> /usr/jdk64 ; tar -xf
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz
> > /dev/null 2>&1'] due to not_if
>
> 2015-02-18 16:27:49,774 - Package['hadoop_2_2_*-yarn'] {}
>
> 2015-02-18 16:27:50,307 - Installing package hadoop_2_2_*-yarn
> ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_2_*-yarn')
>
> 2015-02-18 16:28:28,134 - Error while executing command 'install':
>
> Traceback (most recent call last):
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
> line 123, in execute
>
>     method(env)
>
>   File
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/YARN/package/scripts/application_timeline_server.py",
> line 30, in install
>
>     self.install_packages(env)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
> line 188, in install_packages
>
>     Package(name)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line
> 148, in __init__
>
>     self.env.run()
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
> line 149, in run
>
>     self.run_action(resource, action)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py",
> line 115, in run_action
>
>     provider_action()
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py",
> line 40, in action_install
>
>     self.install_package(package_name)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py",
> line 36, in install_package
>
>     shell.checked_call(cmd)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
> 36, in checked_call
>
>     return _call(command, logoutput, True, cwd, env, preexec_fn, user,
> wait_for_finish, timeout, path)
>
>   File
> "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line
> 102, in _call
>
>     raise Fail(err_msg)
>
> Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_2_*-yarn'
> returned 1. Error Downloading Packages:
>
>   hadoop_2_2_0_0_2041-yarn-2.6.0.2.2.0.0-2041.el6.x86_64: failure:
> hadoop/hadoop_2_2_0_0_2041-yarn-2.6.0.2.2.0.0-2041.el6.x86_64.rpm from
> HDP-2.2: [Errno 256] No more mirrors to try.
>
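
When an RPM is browsable but yum still dies with "[Errno 256] No more mirrors
to try", the mismatch is usually at the yum layer (stale or truncated cached
metadata, or a proxy) rather than at the web server. A few commands worth
trying by hand on a failing node; this is just a sketch, nothing here is
Ambari-specific:

```shell
# Throw away any cached (possibly stale or truncated) metadata for HDP-2.2
yum clean all --disablerepo='*' --enablerepo='HDP-2.2'
yum makecache --disablerepo='*' --enablerepo='HDP-2.2'

# Re-run the exact command Ambari ran, with verbose output this time
/usr/bin/yum -v -y install 'hadoop_2_2_*-yarn'
```

If `makecache` itself fails, the problem is in the repodata/ directory on the
web server; compare what the node sees at
http://l1032lab.sss.se.scania.com/hdp/HDP/centos6/2.x/GA/2.2.0.0/repodata/repomd.xml
against what your browser sees.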



-- 
Devopam Mittra
Life and Relations are not binary
