-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/42094/#review113572
-----------------------------------------------------------
Ship it!

Ship It!

- Sumit Mohanty


On Jan. 9, 2016, 12:01 a.m., Alejandro Fernandez wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/42094/
> -----------------------------------------------------------
> 
> (Updated Jan. 9, 2016, 12:01 a.m.)
> 
> 
> Review request for Ambari, Andrew Onischuk, Jonathan Hurley, Jayush Luniya,
> Nate Cole, Sumit Mohanty, Srimanth Gunturi, and Sid Wagle.
> 
> 
> Bugs: AMBARI-14596
>     https://issues.apache.org/jira/browse/AMBARI-14596
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Cluster installation failed on the Accumulo Client because it was one of the
> first tasks scheduled, and the HDFS Client, which installs the hadoop rpm and
> creates the /etc/hadoop/conf folder, had not been installed yet.
> 
> If a host does not have /etc/hadoop/conf, we should not attempt to write
> config files to it during the after-INSTALL hook. Once a component that does
> ship the hadoop rpm is installed, that component will be responsible for
> writing out the configs.
> 
> Ambari 2.2.1.0-71
> HDP 2.4.0.0-47
> 
> ```
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
>     AfterInstallHook().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 33, in hook
>     setup_config()
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 55, in setup_config
>     only_if=format("ls {hadoop_conf_dir}"))
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
>     encoding = self.resource.encoding
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
>     raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
> resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
> ```
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py 98b7eb5 
> 
> Diff: https://reviews.apache.org/r/42094/diff/
> 
> 
> Testing
> -------
> 
> Verified on a new cluster installation.
> 
> Python unit tests passed.
> 
> OK
> ----------------------------------------------------------------------
> Total run: 858
> Total errors: 0
> Total failures: 0
> OK
> 
> 
> Thanks,
> 
> Alejandro Fernandez
> 
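
A minimal standalone sketch of the guard described in the request, assuming a plain os.path.exists() check; the names setup_config and HADOOP_CONF_DIR are illustrative only, and this is not the actual patch to shared_initialization.py (see the diff link above).

```python
import os

# Directory created by the hadoop rpm when a component such as HDFS Client is installed.
HADOOP_CONF_DIR = "/etc/hadoop/conf"


def setup_config(hadoop_conf_dir=HADOOP_CONF_DIR):
    """Illustrative guard: only write configs when the conf dir already exists."""
    if not os.path.exists(hadoop_conf_dir):
        # No hadoop rpm on this host yet (e.g. only Accumulo Client has been
        # scheduled so far); a later component that ships the rpm will write the configs.
        print("Skipping config generation, %s does not exist" % hadoop_conf_dir)
        return
    # ... generate core-site.xml and the other *-site.xml files under hadoop_conf_dir ...
    print("Writing configs to %s" % hadoop_conf_dir)


if __name__ == "__main__":
    setup_config()
```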
