[ https://issues.apache.org/jira/browse/AMBARI-8076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14192437#comment-14192437 ]

Hudson commented on AMBARI-8076:
--------------------------------

ABORTED: Integrated in Ambari-branch-1.7.0-docker #46 (See 
[https://builds.apache.org/job/Ambari-branch-1.7.0-docker/46/])
AMBARI-8076. Install on a 5 node cluster fails with link creation for 
libsnappy. (aonishuk) (aonishuk: 
http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=e5c08c5b5a35c528f44c25b863acaa3fe28bfc44)
* 
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py
* 
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/params.py
* 
ambari-server/src/test/python/stacks/2.0.6/hooks/before-START/test_before_start.py
* 
ambari-server/src/test/python/stacks/1.3.2/hooks/before-START/test_before_start.py
* 
ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/shared_initialization.py
* 
ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/params.py


> Install on a 5 node cluster fails with link creation for libsnappy.
> -------------------------------------------------------------------
>
>                 Key: AMBARI-8076
>                 URL: https://issues.apache.org/jira/browse/AMBARI-8076
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 1.7.0
>
>
> Install on a 5 node cluster fails with link creation for libsnappy.
>     
>     
>     
>     2014-10-29 22:38:09,249 - Error while executing command 'start':
>     Traceback (most recent call last):
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
>  line 122, in execute
>         method(env)
>       File 
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py",
>  line 32, in hook
>         setup_hadoop()
>       File 
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py",
>  line 34, in setup_hadoop
>         install_snappy()
>       File 
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py",
>  line 168, in install_snappy
>         format("mkdir -p {so_target_dir_x86}; ln -sf {so_src_x86} 
> {so_target_x86}"))
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 
> 148, in __init__
>         self.env.run()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 149, in run
>         self.run_action(resource, action)
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 115, in run_action
>         provider_action()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 241, in action_run
>         raise ex
>     Fail: Execution of 'mkdir -p 
> /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf 
> /usr/hdp/current/hadoop-client/lib/libsnappy.so 
> /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so' 
> returned 1. mkdir: cannot create directory `/usr/hdp/current/hadoop-client': 
> File exists
>     ln: creating symbolic link 
> `/usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so': No 
> such file or directory
>     stdout:   /var/lib/ambari-agent/data/output-78.txt
>     
>     2014-10-29 22:38:08,708 - Execute['mkdir -p 
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 
> 10     
> http://pt170-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip
>  -o 
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] 
> {'environment': ..., 'not_if': 'test -e 
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 
> 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
>     2014-10-29 22:38:08,725 - Skipping Execute['mkdir -p 
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 
> 10     
> http://pt170-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip
>  -o 
> /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] 
> due to not_if
>     2014-10-29 22:38:08,726 - Group['hadoop'] {'ignore_failures': False}
>     2014-10-29 22:38:08,727 - Modifying group hadoop
>     2014-10-29 22:38:08,788 - Group['nobody'] {'ignore_failures': False}
>     2014-10-29 22:38:08,788 - Modifying group nobody
>     2014-10-29 22:38:08,824 - Group['users'] {'ignore_failures': False}
>     2014-10-29 22:38:08,825 - Modifying group users
>     2014-10-29 22:38:08,858 - Group['nagios'] {'ignore_failures': False}
>     2014-10-29 22:38:08,858 - Modifying group nagios
>     2014-10-29 22:38:08,891 - Group['knox'] {'ignore_failures': False}
>     2014-10-29 22:38:08,892 - Modifying group knox
>     2014-10-29 22:38:08,916 - User['nobody'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'nobody']}
>     2014-10-29 22:38:08,916 - Modifying user nobody
>     2014-10-29 22:38:08,929 - User['hive'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:08,930 - Modifying user hive
>     2014-10-29 22:38:08,942 - User['oozie'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2014-10-29 22:38:08,942 - Modifying user oozie
>     2014-10-29 22:38:08,955 - User['nagios'] {'gid': 'nagios', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:08,955 - Modifying user nagios
>     2014-10-29 22:38:08,968 - User['ambari-qa'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2014-10-29 22:38:08,968 - Modifying user ambari-qa
>     2014-10-29 22:38:08,981 - User['flume'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:08,981 - Modifying user flume
>     2014-10-29 22:38:08,993 - User['hdfs'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:08,994 - Modifying user hdfs
>     2014-10-29 22:38:09,006 - User['knox'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,006 - Modifying user knox
>     2014-10-29 22:38:09,019 - User['storm'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,019 - Modifying user storm
>     2014-10-29 22:38:09,031 - User['mapred'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,032 - Modifying user mapred
>     2014-10-29 22:38:09,044 - User['hbase'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,044 - Modifying user hbase
>     2014-10-29 22:38:09,057 - User['tez'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2014-10-29 22:38:09,057 - Modifying user tez
>     2014-10-29 22:38:09,070 - User['zookeeper'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,070 - Modifying user zookeeper
>     2014-10-29 22:38:09,082 - User['kafka'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,083 - Modifying user kafka
>     2014-10-29 22:38:09,095 - User['falcon'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,095 - Modifying user falcon
>     2014-10-29 22:38:09,108 - User['sqoop'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,108 - Modifying user sqoop
>     2014-10-29 22:38:09,121 - User['yarn'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,122 - Modifying user yarn
>     2014-10-29 22:38:09,134 - User['hcat'] {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-29 22:38:09,135 - Modifying user hcat
>     2014-10-29 22:38:09,147 - 
> File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': 
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2014-10-29 22:38:09,149 - 
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
>  2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>     2014-10-29 22:38:09,160 - Skipping 
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
>  2>/dev/null'] due to not_if
>     2014-10-29 22:38:09,161 - 
> File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': 
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2014-10-29 22:38:09,162 - 
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase 
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 
> 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
>     2014-10-29 22:38:09,173 - Skipping 
> Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase 
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 
> 2>/dev/null'] due to not_if
>     2014-10-29 22:38:09,174 - Directory['/etc/hadoop/conf.empty'] {'owner': 
> 'root', 'group': 'root', 'recursive': True}
>     2014-10-29 22:38:09,174 - Link['/etc/hadoop/conf'] {'not_if': 'ls 
> /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
>     2014-10-29 22:38:09,185 - Skipping Link['/etc/hadoop/conf'] due to not_if
>     2014-10-29 22:38:09,197 - File['/etc/hadoop/conf/hadoop-env.sh'] 
> {'content': InlineTemplate(...), 'owner': 'hdfs'}
>     2014-10-29 22:38:09,207 - Execute['/bin/echo 0 > /selinux/enforce'] 
> {'only_if': 'test -f /selinux/enforce'}
>     2014-10-29 22:38:09,235 - Execute['mkdir -p 
> /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf 
> /usr/hdp/current/hadoop-client/lib/libsnappy.so 
> /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so'] {}
>     2014-10-29 22:38:09,249 - Error while executing command 'start':
>     Traceback (most recent call last):
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
>  line 122, in execute
>         method(env)
>       File 
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py",
>  line 32, in hook
>         setup_hadoop()
>       File 
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py",
>  line 34, in setup_hadoop
>         install_snappy()
>       File 
> "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py",
>  line 168, in install_snappy
>         format("mkdir -p {so_target_dir_x86}; ln -sf {so_src_x86} 
> {so_target_x86}"))
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 
> 148, in __init__
>         self.env.run()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 149, in run
>         self.run_action(resource, action)
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 115, in run_action
>         provider_action()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 241, in action_run
>         raise ex
>     Fail: Execution of 'mkdir -p 
> /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf 
> /usr/hdp/current/hadoop-client/lib/libsnappy.so 
> /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so' 
> returned 1. mkdir: cannot create directory `/usr/hdp/current/hadoop-client': 
> File exists
>     ln: creating symbolic link 
> `/usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so': No 
> such file or directory
>     
>     
> Looks like this happens if there is no hadoop-related library installed on
> the host.
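The diagnosis above can be reproduced outside Ambari: when no hadoop package is installed, `/usr/hdp/current/hadoop-client` is a symlink whose target does not exist, and `mkdir -p` through a dangling symlink fails exactly as in the log. A minimal sketch (the `hadoop-client` paths below are stand-ins created in a temp directory, not the real HDP layout):

```python
import os
import subprocess
import tempfile

# Simulate /usr/hdp/current/hadoop-client pointing at a package dir
# that was never installed (dangling symlink).
workdir = tempfile.mkdtemp()
target = os.path.join(workdir, "hadoop-client-pkg")  # intentionally not created
link = os.path.join(workdir, "hadoop-client")
os.symlink(target, link)

# Run the same command shape as install_snappy():
#   mkdir -p <link>/lib/native/Linux-i386-32
# 'mkdir -p' hits the existing symlink, stat()s through it, finds no
# directory behind it, and reports "File exists" (GNU coreutils behavior).
result = subprocess.run(
    ["mkdir", "-p", os.path.join(link, "lib", "native", "Linux-i386-32")],
    capture_output=True,
    text=True,
    env={**os.environ, "LC_ALL": "C"},  # force English error messages
)
print(result.returncode)  # non-zero
print(result.stderr)
```

This matches the two errors in the log: `mkdir` stops at the dangling symlink with "File exists", so the subsequent `ln -sf` into the never-created directory fails with "No such file or directory".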



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
