ashb commented on a change in pull request #16352:
URL: https://github.com/apache/airflow/pull/16352#discussion_r657801655
##########
File path: airflow/models/dagrun.py
##########
@@ -123,10 +127,47 @@ def __init__(
run_type: Optional[str] = None,
dag_hash: Optional[str] = None,
creating_job_id: Optional[int] = None,
+ data_interval: Optional[Tuple[datetime, datetime]] = None,
):
+ # The preferred signature *requires* the data_interval argument. The
+ # legacy form of accepting an optional execution_date (and disallowing
+ # data_interval) is deprecated but accepted for compatibility.
+ if data_interval is None:
+ warnings.warn(
+ "Creating a DagRun without data_interval is deprecated.",
+ DeprecationWarning,
+ stacklevel=2,
+ )
+ if execution_date is None:
+ execution_date = timezone.utcnow()
+ self.data_interval_start = execution_date
+ if run_type == DagRunType.MANUAL:
+ self.data_interval_end = execution_date
+ else:
+ # This is terribly inefficient, but the caller should fix this :)
+ # The local import is necessary to avoid circular reference.
+ from airflow.models.serialized_dag import SerializedDagModel
+
+ with create_session() as session:
+ serialized = session.query(SerializedDagModel)
+ dag = serialized.filter(SerializedDagModel.dag_id == dag_id).first().dag
##########
 Review comment:
```suggestion
dag = serialized.filter(SerializedDagModel.dag_id == dag_id).one().dag
```
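For context (not part of the original review): `.first()` returns `None` when no row matches, so the chained `.first().dag` would later fail with an opaque `AttributeError`, whereas `.one()` raises `NoResultFound` (or `MultipleResultsFound`) at the query site. A minimal sketch of the difference, using a hypothetical `SerializedDag` stand-in model rather than Airflow's real `SerializedDagModel`:

```python
# Sketch only: contrasts Query.first() and Query.one() on an empty result set.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base
from sqlalchemy.orm.exc import NoResultFound

Base = declarative_base()

class SerializedDag(Base):  # hypothetical stand-in for SerializedDagModel
    __tablename__ = "serialized_dag"
    id = Column(Integer, primary_key=True)
    dag_id = Column(String, nullable=False)

engine = create_engine("sqlite://")  # in-memory database, starts empty
Base.metadata.create_all(engine)

with Session(engine) as session:
    query = session.query(SerializedDag).filter(SerializedDag.dag_id == "missing")

    # .first() silently yields None; chaining `.dag` off it would raise
    # AttributeError far from the actual cause.
    assert query.first() is None

    # .one() fails loudly right here, pointing at the real problem:
    # the serialized DAG row does not exist.
    try:
        query.one()
        raised = False
    except NoResultFound:
        raised = True
    assert raised
```

Failing fast with `.one()` also surfaces the "more than one row" case, which `.first()` would mask by arbitrarily returning the first match.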
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]