ephraimbuddy commented on a change in pull request #19528:
URL: https://github.com/apache/airflow/pull/19528#discussion_r747235478
##########
File path: airflow/jobs/scheduler_job.py
##########
@@ -913,26 +913,23 @@ def _create_dag_runs(self, dag_models: Collection[DagModel], session: Session) -
creating_job_id=self.id,
)
active_runs_of_dags[dag.dag_id] += 1
-            self._update_dag_next_dagruns(dag, dag_model, active_runs_of_dags[dag.dag_id])
+            if self._should_update_dag_next_dagruns(dag, dag_model, active_runs_of_dags[dag.dag_id]):
Review comment:
This will still have the missing-dagruns issue because it calls
`dag.get_next_data_interval` when a run has finished, which skips that run.
Using the latest dagrun when a run has just finished and calling
`dag.get_run_data_interval` resolves the issue. `get_next_data_interval` gives
us the *next* dagrun's data interval, whereas `get_run_data_interval`, when
supplied with the most recent dagrun, returns that run's data interval, which
sets the DagRunInfo correctly, in particular `next_dagrun_create_after` for the
dag. That way, the nullified `dag_model.next_dagrun_create_after` is set again
properly.
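
To illustrate the suggestion, here is a rough sketch (not the actual PR code;
the helper name and call site are hypothetical, assuming the Airflow 2.2-era
`DAG`/`DagModel` APIs) of deriving the data interval from the just-finished run
instead of from the dag model:

```python
# Rough sketch only; the helper name is hypothetical and this is not the PR change.
# Idea: when a dag run has just finished, recompute the dag model's next-dagrun
# fields from that run's own data interval instead of calling
# dag.get_next_data_interval(dag_model).
from airflow.models.dag import DAG, DagModel
from airflow.models.dagrun import DagRun


def update_next_dagrun_from_finished_run(dag: DAG, dag_model: DagModel, finished_run: DagRun) -> None:
    # get_run_data_interval returns the finished run's data interval; feeding it
    # to calculate_dagrun_date_fields lets the dag model recompute next_dagrun /
    # next_dagrun_create_after, so the previously nullified
    # next_dagrun_create_after is populated again.
    data_interval = dag.get_run_data_interval(finished_run)
    dag_model.calculate_dagrun_date_fields(dag, data_interval)
```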