Gentle ping. Any ideas on how to deal with this kind of scenario?

On 9/16/2018 10:35 AM, Tharun M wrote:
Hi,
We are also facing the same issue. /user/hive/warehouse always reaches its hard
quota and jobs fail. Often we reach out to users to ask them to delete old tables
and databases. Is there a good way to handle this at enterprise scale (hundreds
of users and thousands of databases)?
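One approach that may help at that scale is a periodic report that flags the databases closest to their space quota, so you only contact the owners who matter. A minimal sketch in shell, assuming per-database space quotas under /user/hive/warehouse; the script name, the 90% threshold, and the directory layout are assumptions, and only the parsing helper below is exercised without a cluster:

```shell
#!/bin/sh
# quota_report.sh -- hypothetical sketch for flagging near-quota databases.
#
# `hdfs dfs -count -q <dir>` prints one line per path:
#   QUOTA REM_QUOTA SPACE_QUOTA REM_SPACE_QUOTA DIR_COUNT FILE_COUNT CONTENT_SIZE PATH
# quota_pct_used takes one such line and prints the percent of the
# space quota consumed (0 when no space quota is set, shown as "none").
quota_pct_used() {
  set -- $1                      # word-split the -count -q line into fields
  space_quota=$3
  rem_space_quota=$4
  if [ "$space_quota" = "none" ]; then
    echo 0
    return
  fi
  echo $(( (space_quota - rem_space_quota) * 100 / space_quota ))
}

# Walk every database directory and alert on the ones above 90% usage.
# (Commented out so the sketch runs without a cluster.)
# for db in $(hdfs dfs -ls /user/hive/warehouse | awk '{print $8}'); do
#   line=$(hdfs dfs -count -q "$db" 2>/dev/null) || continue
#   pct=$(quota_pct_used "$line")
#   [ "$pct" -ge 90 ] && echo "ALERT: $db at ${pct}% of space quota"
# done
```

Piping such a report into email or a dashboard turns the manual "reach out to users" step into a targeted, automated one.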

On Sun, Sep 16, 2018 at 00:31 Mahender Sarangam 
<mahender.bigd...@outlook.com> wrote:
Hi,

Our storage holds terabytes of data under the /user folder: per-user directories
and their logs. Is there a way to set a limit or quota and automatically clean up
a folder when it grows beyond a certain limit?


$ sudo -u hdfs hdfs dfsadmin -setSpaceQuota 10g /user

I know the above command sets the limit, but is there a better way to do the cleanup?
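A quota alone only makes writes fail once the limit is hit; the cleanup itself still has to be scripted. A minimal sketch, assuming GNU `date`, a /user/&lt;name&gt;/logs layout, and a 30-day retention policy (all assumptions, not part of the original question); only the age-check helper is exercised without a cluster:

```shell
#!/bin/sh
# cleanup_old_logs.sh -- hypothetical sketch for time-based HDFS cleanup.
#
# `hdfs dfs -ls -R <dir>` prints, per file:
#   PERMS REPL OWNER GROUP SIZE DATE TIME PATH
# is_older_than_days succeeds when a YYYY-MM-DD file date is older
# than the given number of days (requires GNU date).
is_older_than_days() {
  file_date=$1
  max_days=$2
  cutoff=$(date -d "-${max_days} days" +%s)
  [ "$(date -d "$file_date" +%s)" -lt "$cutoff" ]
}

# Move log files older than 30 days to trash. Plain `hdfs dfs -rm`
# (without -skipTrash) is deliberate: the files stay recoverable for
# the trash interval, which matters when deleting users' data.
# (Commented out so the sketch runs without a cluster.)
# hdfs dfs -ls -R /user/*/logs 2>/dev/null | awk '$1 !~ /^d/' |
# while read -r _ _ _ _ _ d _ path; do
#   is_older_than_days "$d" 30 && hdfs dfs -rm "$path"
# done
```

Run from cron (or Oozie/Airflow) this keeps usage below the quota proactively, rather than reacting after jobs start failing. Note that trashed files still count against the owner's /user quota until the trash interval expires.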



