> On March 15, 2016, 8:19 a.m., Nate Cole wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/conf_select.py, line 526
> > <https://reviews.apache.org/r/44831/diff/1/?file=1299166#file1299166line526>
> >
> >     Maybe a bit more detail here - it's a pain enough to debug. Maybe
> >     "Seeding versioned directory for <package> from <fromdir> to <todir>"
> 
> Jonathan Hurley wrote:
>     I tested this with both errors and non-errors during seeding. In both
>     cases, the `cp` command provided the information needed, so I didn't
>     think we needed to reiterate it in this log statement. It's something
>     like this:
>     
>     ```
>     2016-03-15 01:22:01,923 - Seeding versioned configuration directories for spark
>     2016-03-15 01:22:01,923 - Execute['ambari-sudo.sh  -H -E cp -R -p -v /usr/hdp/current/spark-client/conf/* /etc/spark/2.4.0.0-169/0'] {}
>     2016-03-15 01:22:01,928 - Unable to seed new configuration directories for spark. Execution of 'ambari-sudo.sh  -H -E cp -R -p -v /usr/hdp/current/spark-client/conf/* /etc/spark/2.4.0.0-169/0' returned 1. cp: cannot stat `/usr/hdp/current/spark-client/conf/*': No such file or directory
>     ```

Ah, I see.  Works for me!


- Nate


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/44831/#review123643
-----------------------------------------------------------


On March 14, 2016, 11:40 p.m., Jonathan Hurley wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/44831/
> -----------------------------------------------------------
> 
> (Updated March 14, 2016, 11:40 p.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez, Dmitro Lisnichenko, and Nate 
> Cole.
> 
> 
> Bugs: AMBARI-15419
>     https://issues.apache.org/jira/browse/AMBARI-15419
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> During an upgrade, configuration files from the old stack are not copied to
> the new configuration directories created using {{conf-select}}. This means
> that files which Ambari does not track are never included in the new
> configuration directories.
> 
> This is what happens on a clean install:
> ```
> /etc/foo/conf (physical conf files placed here)
> /usr/hdp/<version>/foo/conf -> /etc/foo/conf
> ```
> 
> Ambari then uses {{conf-select}} to change this:
> ```
> /etc/foo/conf.backup (contents of original /etc/foo/conf folder)
> /etc/foo/conf -> /usr/hdp/<version>/foo/conf
> /usr/hdp/<version>/foo/conf -> /etc/foo/<version>/0
> /etc/foo/<version>/0   (physical conf files placed here)
> ```
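> 
> As a purely illustrative sketch (plain Python, not the actual conf-select
> tool; "foo" and the version below are placeholders), the re-pointing above
> amounts to something like this:
> ```
> import os
> import shutil
> 
> package, version = "foo", "2.4.0.0"
> etc_conf = "/etc/{0}/conf".format(package)
> usr_conf = "/usr/hdp/{0}/{1}/conf".format(version, package)
> versioned_conf = "/etc/{0}/{1}/0".format(package, version)
> 
> # Preserve the original physical directory as conf.backup
> shutil.move(etc_conf, etc_conf + ".backup")
> 
> # Create the new physical, versioned configuration directory
> os.makedirs(versioned_conf)
> 
> # Re-point the chain:
> #   /usr/hdp/<version>/foo/conf -> /etc/foo/<version>/0
> #   /etc/foo/conf               -> /usr/hdp/<version>/foo/conf
> if os.path.islink(usr_conf):
>     os.remove(usr_conf)
> os.symlink(versioned_conf, usr_conf)
> os.symlink(usr_conf, etc_conf)
> ```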
> 
> In this scenario, however, we make sure to seed {{/etc/foo/<version>/0}} with
> the files which were originally in {{/etc/foo/conf}}, including files which
> we don't track. This prevents files such as JKS keystores from being lost.
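> 
> The seeding itself is essentially a recursive, attribute-preserving copy. A
> minimal sketch of that step (plain subprocess here rather than the
> resource_management Execute resource seen in the log above; the paths are
> placeholders):
> ```
> import subprocess
> 
> source = "/etc/foo/conf.backup"
> target = "/etc/foo/2.4.0.0/0"
> 
> # -R copies recursively, -p preserves mode/ownership/timestamps, so untracked
> # files such as JKS keystores survive in the new versioned directory.
> code = subprocess.call("cp -R -p -v {0}/* {1}/".format(source, target), shell=True)
> if code != 0:
>     # Mirror the tolerated "Unable to seed..." case shown in the log above.
>     print("Unable to seed new configuration directory for foo")
> ```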
> 
> *Now the upgrade scenario:*
> ----
> If you already have {{/usr/hdp/2.3.0.0}} installed, then {{/etc/foo/conf}} and
> all of its associated files are already on disk. When you distribute HDP 2.4,
> {{/usr/hdp/2.4.0.0}} is created, but it cannot overwrite any existing
> configurations since you haven't upgraded yet, so it leaves the configurations
> alone. We invoke {{conf-select}} to create {{/etc/foo/2.4.0.0/0}}, but that
> directory is never seeded with any values from {{/etc/foo/2.3.0.0/0}}.
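> 
> The change adds that missing seeding step: copy whatever the "current"
> configuration symlink points at into the newly created versioned directory,
> which is the cp visible in the log output quoted at the top of this thread.
> A rough illustration only (not the actual conf_select.py change; the spark
> paths are taken from that log):
> ```
> import subprocess
> 
> current_conf = "/usr/hdp/current/spark-client/conf"
> new_versioned_conf = "/etc/spark/2.4.0.0-169/0"
> 
> # Same shape as the logged command: an attribute-preserving recursive copy
> # run through ambari-sudo.sh.
> code = subprocess.call(
>     "ambari-sudo.sh -H -E cp -R -p -v {0}/* {1}".format(current_conf, new_versioned_conf),
>     shell=True)
> if code != 0:
>     # Matches the tolerated "Unable to seed..." case shown in the log above.
>     print("Unable to seed new configuration directories for spark")
> ```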
> 
> 
> Diffs
> -----
> 
>   ambari-common/src/main/python/resource_management/libraries/functions/conf_select.py 59c717b 
>   ambari-server/src/main/resources/custom_actions/scripts/install_packages.py 08bdcc3 
>   ambari-server/src/test/python/stacks/2.0.6/hooks/after-INSTALL/test_after_install.py daee726 
> 
> Diff: https://reviews.apache.org/r/44831/diff/
> 
> 
> Testing
> -------
> 
> ----------------------------------------------------------------------
> Total run:924
> Total errors:0
> Total failures:0
> OK
> 
> 
> Thanks,
> 
> Jonathan Hurley
> 
>
