> On Jan. 21, 2014, 6:55 p.m., Sumit Mohanty wrote:
> > Dmytro, let's not hard-code the logic this way while reading the host check 
> > file. Let's do the following:
> > * Add a constant capturing the folder name pattern and parent folder (e.g. 
> > "/tmp/hadoop-" and "/tmp") to look for
> > * Create a method that will return the list of directories to delete based 
> > on the above pattern - e.g. get_additional_dirs_to_delete()
> > * Call do_erase_dir_silent with that data under the existing "if dirList and 
> > not DIR_SECTION in SKIP_LIST:" check
> > 
> > This way, if we discover other directory patterns, we can simply modify the 
> > constant. As host cleanup is an evolving process, we will get more bugs of 
> > this nature.

Sumit Mohanty, please see the last diff (r4).


- Dmytro


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/17126/#review32393
-----------------------------------------------------------


On Jan. 21, 2014, 6:50 p.m., Dmytro Shkvyra wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/17126/
> -----------------------------------------------------------
> 
> (Updated Jan. 21, 2014, 6:50 p.m.)
> 
> 
> Review request for Ambari, Dmitro Lisnichenko and Mahadev Konar.
> 
> 
> Bugs: AMBARI-4354
>     https://issues.apache.org/jira/browse/AMBARI-4354
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> HostCleanup should also check (and clean) the following directories:
> 
> /tmp/hadoop-*
> 
> For example, if the /tmp/hadoop-nagios directory is present but does not have 
> the right ownership/permissions for the nagios user, the Hive Metastore Nagios 
> alert fires. I saw this after doing an install on an unclean machine.
> 
> {code}
> Hive Metastore status
> CRIT for about a minute
> CRITICAL: Error accessing Hive Metastore status [Error creating temp dir in 
> hadoop.tmp.dir /tmp/hadoop-nagios due to Permission denied]
> {code}
> 
> An easy way to reproduce this:
> 
> 1) Perform an install
> 2) Go to the Nagios server machine
> 3) Change perms on /tmp/hadoop-nagios so that the nagios user does not have 
> access
> 4) The Nagios alert will fire
> 
> 
> Diffs
> -----
> 
>   ambari-agent/src/main/python/ambari_agent/HostCleanup.py 593ad16 
>   ambari-agent/src/test/python/ambari_agent/TestHostCleanup.py 94d1715 
> 
> Diff: https://reviews.apache.org/r/17126/diff/
> 
> 
> Testing
> -------
> 
> Tested on 1.3.3
> 
> 
> Thanks,
> 
> Dmytro Shkvyra
> 
>
