ephraimbuddy commented on a change in pull request #16182:
URL: https://github.com/apache/airflow/pull/16182#discussion_r646414422
##########
File path: airflow/jobs/scheduler_job.py
##########
@@ -1417,6 +1420,37 @@ def _clean_tis_without_dagrun(self, session):
raise
guard.commit()
+    @provide_session
+    def _missing_dag_file_cleanup(self, session: Session = None):
+        """Fails task instances and DagRuns of DAGs that no longer exist in the dag folder"""
+        states_to_check = State.unfinished - frozenset([State.NONE, State.SHUTDOWN])
+        tis = session.query(TI).filter(TI.state.in_(states_to_check)).all()
+        missing_dags = {}
+        dag_runs = set()
+        for ti in tis:
+            dag = self.dagbag.dags.get(ti.dag_id, None)
+            if not dag:
+                continue
Review comment:
> Loading the dag is an "expensive" operation, so this should be reworked to add a new method on dagbag: `has_dag` that does the following:
>
> * Checks the "local" cache; if the entry exists and is not older than the configured cache time, return True
> * Else check in the DB -- but crucially, only check for the row existing; don't load the full DagModel object, and crucially don't load the SerializedDag.
I thought `self.dagbag.dags` is a dictionary containing all already-processed DAGs?
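
The `has_dag` method proposed in the quoted suggestion might be sketched roughly as below. This is a hypothetical illustration, not Airflow's actual implementation: the `DagBag` stand-in, the `dag` table schema, and the `cache_ttl` parameter are assumptions, and stdlib `sqlite3` stands in for the scheduler's SQLAlchemy session. The key point it demonstrates is the two-step check: a TTL-guarded local cache first, then an existence-only DB query that never loads the full DagModel or SerializedDag.

```python
import sqlite3
import time


class DagBag:
    """Hypothetical stand-in for airflow.models.DagBag, showing only has_dag()."""

    def __init__(self, conn: sqlite3.Connection, cache_ttl: float = 30.0):
        self._conn = conn
        self._cache_ttl = cache_ttl  # assumed "configured cache time" in seconds
        # dag_id -> monotonic time of the last positive DB hit
        self._has_dag_hits: dict[str, float] = {}

    def has_dag(self, dag_id: str) -> bool:
        # 1. Check the local cache; if the entry is fresher than the TTL, trust it.
        hit = self._has_dag_hits.get(dag_id)
        if hit is not None and time.monotonic() - hit < self._cache_ttl:
            return True
        # 2. Otherwise ask the DB -- existence check only: SELECT a constant,
        #    so no full DagModel (and no SerializedDag) row is materialized.
        row = self._conn.execute(
            "SELECT 1 FROM dag WHERE dag_id = ? LIMIT 1", (dag_id,)
        ).fetchone()
        if row is not None:
            self._has_dag_hits[dag_id] = time.monotonic()
        return row is not None
```

Note that only positive results are cached: a DAG that is absent now may be parsed and registered a moment later, so negative answers should stay fresh.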
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]