Hey there, I have a question about Airflow DAG deployment. We ran into an issue where, while a DAG is running, if a new DAG Python file is deployed (i.e. the CI/CD process replaces the original file), the downstream tasks change based on the new file.
I went through the Airflow jobs and dag_processing code, and it looks like Airflow aggressively re-parses the DAG definition, which is why the downstream tasks change in my case. Please advise whether my understanding is correct:

https://github.com/apache/incubator-airflow/blob/master/airflow/jobs.py#L1617
https://github.com/apache/incubator-airflow/blob/master/airflow/utils/dag_processing.py#L535

If so, is there any way/option in Airflow to skip DAG file processing while a DAG is running? I would also love to learn how other people usually deploy their DAGs, and whether anyone has faced this issue before.

Thanks,
Chengzhi
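To make the scenario concrete, here is a minimal stand-alone simulation of the behavior described above (this is plain Python, not Airflow code; the file names and task lists are hypothetical): the scheduler re-parses the DAG file on each loop, so if CI/CD swaps the file while a run is in flight, later scheduling decisions see the new task list.

```python
# Simulation of the re-parse behavior: the "scheduler" reads the DAG
# file fresh each loop, so a mid-run deploy changes downstream tasks.
import os
import runpy
import tempfile

ORIGINAL = "TASKS = ['extract', 'transform', 'load']\n"
DEPLOYED = "TASKS = ['extract', 'load_v2']\n"  # hypothetical new version

def parse_dag_file(path):
    # Stands in for the scheduler's per-loop re-parse of the DAG .py file.
    return runpy.run_path(path)["TASKS"]

with tempfile.TemporaryDirectory() as d:
    dag_file = os.path.join(d, "my_dag.py")

    with open(dag_file, "w") as f:
        f.write(ORIGINAL)
    before = parse_dag_file(dag_file)   # tasks the run started with

    # CI/CD replaces the file while the "run" is still in flight
    with open(dag_file, "w") as f:
        f.write(DEPLOYED)
    after = parse_dag_file(dag_file)    # next scheduler loop sees new tasks

    print(before)  # ['extract', 'transform', 'load']
    print(after)   # ['extract', 'load_v2']
```

The mismatch between `before` and `after` is exactly the surprise in the original question: the running DagRun was created against one task list, but subsequent parses return another.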
