> On Jan. 10, 2016, 8:29 a.m., Jonathan Hurley wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py, line 58
> > <https://reviews.apache.org/r/42094/diff/1/?file=1188905#file1188905line58>
> >
> >     This is a very specific if-statement and doesn't keep this file generic.
> >     
> >     - Why do we care if namenode is present or not? If the directory exists, just write the configs.
> >     - Why do we care about GlusterFS or HCFS here? As long as the directory exists, why not write out core-site.xml?
> 
> Alejandro Fernandez wrote:
>     This is existing code that someone else wrote, and I would presume for good reason. I don't want to muck with it.

<sigh/> I guess this is why I'm such a jerk on reviews about making people add comments. Now we have to live with this weird if-statement forever because we're afraid to remove it.
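For reference, the behavior being debated can be sketched as a plain directory-existence guard (a hypothetical simplification of the hook's `setup_config`, not the actual stack code; the file write below stands in for the real `XmlConfig` resource):

```python
import os

def setup_config(hadoop_conf_dir="/etc/hadoop/conf"):
    """Simplified sketch of the after-INSTALL hook's guard.

    Only write core-site.xml once the hadoop rpm has created the conf
    directory; otherwise skip, and let whichever component ships the
    rpm write the configs when it is installed.
    """
    if not os.path.isdir(hadoop_conf_dir):
        return False  # hadoop rpm not installed on this host yet; skip
    # The real hook renders core-site.xml via an XmlConfig resource;
    # a plain file write stands in for it in this sketch.
    with open(os.path.join(hadoop_conf_dir, "core-site.xml"), "w") as f:
        f.write("<configuration/>\n")
    return True
```

Checking the directory itself, rather than checking for specific components such as the namenode or specific filesystems such as GlusterFS, is what would keep the hook generic.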


- Jonathan


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/42094/#review113647
-----------------------------------------------------------


On Jan. 8, 2016, 7:01 p.m., Alejandro Fernandez wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/42094/
> -----------------------------------------------------------
> 
> (Updated Jan. 8, 2016, 7:01 p.m.)
> 
> 
> Review request for Ambari, Andrew Onischuk, Jonathan Hurley, Jayush Luniya, 
> Nate Cole, Sumit Mohanty, Srimanth Gunturi, and Sid Wagle.
> 
> 
> Bugs: AMBARI-14596
>     https://issues.apache.org/jira/browse/AMBARI-14596
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Cluster installation failed on Accumulo Client because it was one of the first tasks scheduled, and HDFS Client, which installs the hadoop rpm and creates the /etc/hadoop/conf folder, had not yet been installed.
> 
> If a host does not contain /etc/hadoop/conf, then we should not attempt to write config files to it during the after-INSTALL hooks. Once a component that does contain the hadoop rpm is installed, it will be responsible for writing out the configs.
> 
> Ambari 2.2.1.0-71
> HDP 2.4.0.0-47
> 
> ```
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
>     AfterInstallHook().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 33, in hook
>     setup_config()
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 55, in setup_config
>     only_if=format("ls {hadoop_conf_dir}"))
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
>     encoding = self.resource.encoding
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
>     raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
> resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
> ```
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py 98b7eb5
> 
> Diff: https://reviews.apache.org/r/42094/diff/
> 
> 
> Testing
> -------
> 
> Verified on a new cluster installation.
> 
> Python unit tests passed.
> 
> ```
> OK
> ----------------------------------------------------------------------
> Total run: 858
> Total errors: 0
> Total failures: 0
> OK
> ```
> 
> 
> Thanks,
> 
> Alejandro Fernandez
> 
>
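The `Fail` at the bottom of the traceback above comes from a parent-directory check in the `File` provider (`core/providers/system.py`). In isolation it behaves roughly like this (simplified sketch; the real provider raises `resource_management.core.exceptions.Fail`, replaced here by a stand-in class):

```python
import os

class Fail(Exception):
    """Stand-in for resource_management.core.exceptions.Fail."""

def check_parent_dir(path):
    # Simplified version of the check that produced the error above:
    # creating a file fails fast when its parent directory is missing.
    dirname = os.path.dirname(path)
    if not os.path.isdir(dirname):
        raise Fail("Applying File[%r] failed, parent directory %s doesn't exist"
                   % (path, dirname))
```

Because the after-INSTALL hook can run before the hadoop rpm exists on a host, `/usr/hdp/current/hadoop-client/conf` is missing there and this check fires.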
