Hi Users,

I submit *X* jobs to YARN via Airflow as part of a workflow for customer
*Y*. I could also start the workflow for customer *Z*, but before the next
customer's jobs begin I need to check how much free capacity the cluster
has.

Could you please tell me the best way to handle this? Currently I just
check whether availableMB > 100 and, if so, trigger the next Airflow DAG
on YARN, using:

GET http://rm-http-address:port/ws/v1/cluster/metrics
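For reference, here is a minimal sketch of that check in Python, using only the standard library. The RM address is the placeholder from the endpoint above, and the 100 MB threshold and function names are just illustrative; the JSON key names (`clusterMetrics`, `availableMB`) are the ones the ResourceManager metrics API returns.

```python
import json
import urllib.request

# Placeholder ResourceManager address from the endpoint above -- substitute your own.
RM_METRICS_URL = "http://rm-http-address:port/ws/v1/cluster/metrics"

def has_capacity(metrics, min_available_mb=100):
    """Return True if the cluster reports more free memory than the threshold."""
    return metrics["clusterMetrics"]["availableMB"] > min_available_mb

def fetch_metrics(url=RM_METRICS_URL):
    """Fetch the cluster metrics JSON from the ResourceManager REST API."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example gating logic before triggering the next DAG:
# if has_capacity(fetch_metrics()):
#     trigger_next_dag()   # hypothetical -- however you kick off the next customer's DAG
```

One option is to wrap a check like this in a custom Airflow sensor (a `poke` method returning `has_capacity(...)`), so the next customer's DAG waits and retries until the cluster frees up instead of being triggered externally. Note that a raw availableMB threshold only sees current free memory, not queued demand, so two workflows checking at the same moment can both pass; YARN queue/capacity configuration is usually the more robust way to partition resources between customers.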

Thanks.
