Here are some pointers that may help you:
1. Ambari: check the HBase configurations. Revert to the last working version if they were changed recently.
2. Without assuming anything, you may want to revalidate connectivity to all NameNodes and DataNodes.
3. Check whether someone has moved files in the filesystem manually.
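Also, judging by the "ValueError: invalid literal for int() with base 10: 'binning'" at the bottom of your traceback, /usr/bin/conf-select seems to trip over a directory entry named 'binning' that does not parse as a stack version. A rough sketch of what line 78 (chksVer) appears to be doing — the split pattern here is an assumption for illustration, not the exact regex conf-select uses:

```python
import re

# Assumption: conf-select splits each directory name it finds into
# numeric pieces (on '.' and '-') so it can sort stack versions.
version_regex = re.compile(r"[.-]")

def parse_version(dirname):
    # '2.3.0.0-2557' -> (2, 3, 0, 0, 2557); any non-numeric piece raises.
    return tuple(map(int, version_regex.split(dirname)))

print(parse_version("2.3.0.0-2557"))  # a real stack version parses fine

try:
    parse_version("binning")  # a stray non-version entry
except ValueError as err:
    print(err)  # invalid literal for int() with base 10: 'binning'
```

If that is what is happening, listing the directory conf-select enumerates (e.g. /usr/hdp) and looking for any entry that is not version-shaped should reveal the culprit; removing or renaming it (after confirming nothing needs it) should let the hook proceed.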
Hope one of them leads to the actual problem!

Regards,
Dev

On Fri, Oct 30, 2015 at 1:24 PM, Jeetendra G <[email protected]> wrote:

> Hi All,
>
> I am using Ambari version 2.1 and I am not able to stop and start services
> now. Every service is in red and giving me the below error. Can somebody
> guide me how to resolve this issue?
>
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
>     AfterInstallHook().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 35, in hook
>     link_configs(self.stroutfile)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 91, in link_configs
>     _link_configs(k, json_version, v)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 156, in _link_configs
>     conf_select.select("HDP", package, version)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 241, in select
>     shell.checked_call(get_cmd("set-conf-dir", package, version), logoutput=False, quiet=False, sudo=True)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of 'conf-select set-conf-dir --package hbase --stack-version 2.3.0.0-2557 --conf-version 0' returned 1. Traceback (most recent call last):
>   File "/usr/bin/conf-select", line 182, in <module>
>     setConfDir(options.pname, options.sver, options.cver)
>   File "/usr/bin/conf-select", line 138, in setConfDir
>     check(sver, pname, cver, "set")
>   File "/usr/bin/conf-select", line 100, in check
>     chksVer(sver)
>   File "/usr/bin/conf-select", line 78, in chksVer
>     result[tuple(map(int, versionRegex.split(f)))] = f
> ValueError: invalid literal for int() with base 10: 'binning'
>
> stdout: /var/lib/ambari-agent/data/output-4760.txt
>
> 2015-10-30 13:21:33,166 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
> 2015-10-30 13:21:33,169 - File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-8.zip'] {'content': DownloadSource('http://ambari.housing.com:8080/resources//jce_policy-8.zip')}
> 2015-10-30 13:21:33,169 - Not downloading the file from http://ambari.housing.com:8080/resources//jce_policy-8.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-8.zip already exists
> 2015-10-30 13:21:33,170 - Group['spark'] {'ignore_failures': False}
> 2015-10-30 13:21:33,171 - Group['hadoop'] {'ignore_failures': False}
> 2015-10-30 13:21:33,171 - Group['users'] {'ignore_failures': False}
> 2015-10-30 13:21:33,172 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,173 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,174 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-10-30 13:21:33,175 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,176 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-10-30 13:21:33,176 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,177 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-10-30 13:21:33,178 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,179 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,179 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,180 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,181 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,181 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-10-30 13:21:33,182 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-10-30 13:21:33,183 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2015-10-30 13:21:33,219 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
> 2015-10-30 13:21:33,220 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
> 2015-10-30 13:21:33,221 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-10-30 13:21:33,222 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2015-10-30 13:21:33,266 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
> 2015-10-30 13:21:33,267 - Group['hdfs'] {'ignore_failures': False}
> 2015-10-30 13:21:33,267 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
> 2015-10-30 13:21:33,268 - Directory['/etc/hadoop'] {'mode': 0755}
> 2015-10-30 13:21:33,286 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
> 2015-10-30 13:21:33,303 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.3.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
> 2015-10-30 13:21:33,314 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
> 2015-10-30 13:21:33,316 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
> 2015-10-30 13:21:33,319 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
> 2015-10-30 13:21:33,320 - Package['unzip'] {}
> 2015-10-30 13:21:33,449 - Skipping installation of existing package unzip
> 2015-10-30 13:21:33,449 - Package['curl'] {}
> 2015-10-30 13:21:33,460 - Skipping installation of existing package curl
> 2015-10-30 13:21:33,460 - Package['hdp-select'] {}
> 2015-10-30 13:21:33,472 - Skipping installation of existing package hdp-select
> 2015-10-30 13:21:33,473 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
> 2015-10-30 13:21:33,473 - File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] {'content': DownloadSource('http://ambari.housing.com:8080/resources//jdk-8u40-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'}
> 2015-10-30 13:21:33,507 - Skipping File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] due to not_if
> 2015-10-30 13:21:33,508 - Directory['/usr/jdk64'] {}
> 2015-10-30 13:21:33,508 - Execute['('chmod', 'a+x', u'/usr/jdk64')'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java', 'sudo': True}
> 2015-10-30 13:21:33,543 - Skipping Execute['('chmod', 'a+x', u'/usr/jdk64')'] due to not_if
> 2015-10-30 13:21:33,544 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java'}
> 2015-10-30 13:21:33,578 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] due to not_if
> 2015-10-30 13:21:33,579 - File['/usr/jdk64/jdk1.8.0_40/bin/java'] {'mode': 0755, 'cd_access': 'a'}
> 2015-10-30 13:21:33,580 - Execute['('chgrp', '-R', u'hadoop', u'/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
> 2015-10-30 13:21:33,630 - Execute['('chown', '-R', 'root', u'/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
> 2015-10-30 13:21:34,142 - Package['hive_2_3_*'] {}
> 2015-10-30 13:21:34,251 - Skipping installation of existing package hive_2_3_*
> 2015-10-30 13:21:34,251 - Package['hive_2_3_*-hcatalog'] {}
> 2015-10-30 13:21:34,264 - Skipping installation of existing package hive_2_3_*-hcatalog
> 2015-10-30 13:21:34,265 - Package['hive_2_3_*-webhcat'] {}
> 2015-10-30 13:21:34,283 - Skipping installation of existing package hive_2_3_*-webhcat
> 2015-10-30 13:21:34,286 - Directory['/usr/hdp/current/hive-client/conf'] {'owner': 'hcat', 'group': 'hadoop', 'recursive': True}
> 2015-10-30 13:21:34,288 - Changing owner for /usr/hdp/current/hive-client/conf from 1000 to hcat
> 2015-10-30 13:21:34,288 - Directory['/usr/hdp/current/hive-webhcat/etc/hcatalog'] {'owner': 'hcat', 'group': 'hadoop', 'recursive': True}
> 2015-10-30 13:21:34,288 - Directory['/var/run/webhcat'] {'owner': 'hcat', 'recursive': True}
> 2015-10-30 13:21:34,289 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hive-client/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
> 2015-10-30 13:21:34,304 - Generating config: /usr/hdp/current/hive-client/conf/hive-site.xml
> 2015-10-30 13:21:34,305 - File['/usr/hdp/current/hive-client/conf/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
> 2015-10-30 13:21:34,453 - Writing File['/usr/hdp/current/hive-client/conf/hive-site.xml'] because contents don't match
> 2015-10-30 13:21:34,456 - File['/usr/hdp/current/hive-webhcat/etc/hcatalog/hcat-env.sh'] {'content': InlineTemplate(...), 'owner': 'hcat', 'group': 'hadoop'}
> 2015-10-30 13:21:34,841 - Execute['ambari-sudo.sh -H -E touch /var/lib/ambari-agent/data/hdp-select-set-all.performed ; ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.3 | tail -1`'] {'not_if': 'test -f /var/lib/ambari-agent/data/hdp-select-set-all.performed', 'only_if': 'ls -d /usr/hdp/2.3*'}
> 2015-10-30 13:21:34,875 - Skipping Execute['ambari-sudo.sh -H -E touch /var/lib/ambari-agent/data/hdp-select-set-all.performed ; ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.3 | tail -1`'] due to not_if
> 2015-10-30 13:21:34,876 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {u'final': {u'fs.defaultFS': u'true'}}, 'owner': 'hdfs', 'only_if': 'ls /usr/hdp/current/hadoop-client/conf', 'configurations': ...}
> 2015-10-30 13:21:34,928 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml
> 2015-10-30 13:21:34,928 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
> 2015-10-30 13:21:34,953 - Writing File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] because contents don't match
> 2015-10-30 13:21:34,956 - Execute['('cp', '-R', '-p', '/etc/hbase/conf', '/etc/hbase/conf.install')'] {'not_if': 'test -e /etc/hbase/conf.install', 'sudo': True}
> 2015-10-30 13:21:34,988 - Skipping Execute['('cp', '-R', '-p', '/etc/hbase/conf', '/etc/hbase/conf.install')'] due to not_if
> 2015-10-30 13:21:35,060 - New conf directories:
> 2015-10-30 13:21:35,136 - checked_call['conf-select set-conf-dir --package hbase --stack-version 2.3.0.0-2557 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}

--
Devopam Mittra
Life and Relations are not binary
