Hey everyone, we recently added a subdag with over 200 tasks, and any time Airflow runs, CPU and memory usage spike and take down the server. The subdag is built by a factory function that lives outside the dags folder and is imported into a DAG. We are using the Celery Executor, with the worker, webserver, scheduler, and Redis each running in a separate Docker container on a single t2.medium instance. Is this normal? Why might this subdag be spiking our memory and CPU?
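For reference, the factory looks roughly like the sketch below (simplified and illustrative only; the actual names and tasks differ, and this assumes the Airflow 1.x SubDagOperator API):

```python
# Simplified sketch of our setup: a subdag factory outside the dags folder,
# imported into a parent DAG. Task and DAG names here are made up.
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.subdag_operator import SubDagOperator


def subdag_factory(parent_dag_name, child_dag_name, default_args):
    """Return a subdag whose dag_id is '<parent>.<child>', as SubDagOperator expects."""
    subdag = DAG(
        dag_id='{}.{}'.format(parent_dag_name, child_dag_name),
        default_args=default_args,
        schedule_interval=None,
    )
    # Around 200 tasks, similar to what the question describes.
    for i in range(200):
        DummyOperator(task_id='task_{}'.format(i), dag=subdag)
    return subdag


default_args = {'owner': 'airflow', 'start_date': datetime(2017, 1, 1)}

dag = DAG('parent_dag', default_args=default_args, schedule_interval='@daily')

big_subdag = SubDagOperator(
    task_id='big_subdag',
    subdag=subdag_factory('parent_dag', 'big_subdag', default_args),
    dag=dag,
)
```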
-- Alexander Keating
