Use YARN queues:

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/FairScheduler.html
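
With the Fair Scheduler you can give each customer its own queue with capped resources, so one customer's workflow cannot starve the next and you no longer need to poll free memory yourself. A minimal `fair-scheduler.xml` sketch along those lines (queue names and resource caps are illustrative, not from the original thread):

```xml
<?xml version="1.0"?>
<!-- Illustrative allocation file: one capped queue per customer, so
     customer Z's jobs can be submitted without first checking availableMB. -->
<allocations>
  <queue name="customerY">
    <maxResources>40960 mb, 16 vcores</maxResources>
    <weight>1.0</weight>
    <schedulingPolicy>fair</schedulingPolicy>
  </queue>
  <queue name="customerZ">
    <maxResources>40960 mb, 16 vcores</maxResources>
    <weight>1.0</weight>
    <schedulingPolicy>fair</schedulingPolicy>
  </queue>
</allocations>
```

Jobs are then submitted with the queue name (e.g. `spark-submit --queue customerZ ...`), and YARN queues work that exceeds a queue's cap instead of rejecting it.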

> On 27.10.2019, at 06:41, Chetan Khatri <chetan.opensou...@gmail.com> wrote:
> 
> 
> Could someone please help me understand this better?
> 
>> On Thu, Oct 17, 2019 at 7:41 PM Chetan Khatri <chetan.opensou...@gmail.com> 
>> wrote:
>> Hi Users,
>> 
>> I submit X jobs to YARN with Airflow as part of a workflow for 
>> customer Y. I could potentially run the workflow for customer Z as well, but 
>> I need to check how much of the cluster's resources are available before 
>> the next customer's jobs should start.
>> 
>> Could you please tell me the best way to handle this? Currently, I just 
>> check that availableMB > 100 and then trigger the next Airflow DAG on Yarn:
>> 
>> GET http://rm-http-address:port/ws/v1/cluster/metrics
>> Thanks.
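
The availableMB check quoted above can be sketched with only the standard library; the ResourceManager address is a placeholder, and the 100 MB threshold is the one from the original mail:

```python
import json
from urllib.request import urlopen

# Placeholder ResourceManager address from the thread; substitute your own.
RM_METRICS_URL = "http://rm-http-address:8088/ws/v1/cluster/metrics"


def has_capacity(metrics: dict, min_available_mb: int = 100) -> bool:
    """Decide whether to trigger the next DAG.

    `metrics` is the parsed JSON body of GET /ws/v1/cluster/metrics;
    the ResourceManager nests the fields under the "clusterMetrics" key.
    """
    return metrics["clusterMetrics"]["availableMB"] > min_available_mb


def fetch_metrics(url: str = RM_METRICS_URL) -> dict:
    """Fetch cluster metrics from the ResourceManager REST API."""
    with urlopen(url) as resp:
        return json.load(resp)


# A live check would be has_capacity(fetch_metrics()); here is the same
# decision against a stubbed response:
sample = {"clusterMetrics": {"availableMB": 2048, "availableVirtualCores": 8}}
print(has_capacity(sample))  # True: 2048 MB free exceeds the 100 MB threshold
```

The caveat with this approach is that it is a point-in-time snapshot: availableMB can drop between the check and the submission, which is why queue-based capping is the more robust answer.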
