vikramcse commented on a change in pull request #15174:
URL: https://github.com/apache/airflow/pull/15174#discussion_r606673340
##########
File path: airflow/api_connexion/endpoints/dag_run_endpoint.py
##########
@@ -240,6 +240,18 @@ def post_dag_run(dag_id, session):
         post_body = dagrun_schema.load(request.json, session=session)
     except ValidationError as err:
         raise BadRequest(detail=str(err))
+
+    dagrun_with_execution_date = (
+        session.query(DagRun)
+        .filter(DagRun.dag_id == dag_id, DagRun.execution_date == post_body["execution_date"])
+        .first()
+    )
+
+    if dagrun_with_execution_date:
+        raise AlreadyExists(
+            detail=f"DAGRun with DAG ID: '{dag_id}' and DAGRun ExecutionDate: '{post_body['execution_date']}' already "
+            f"exists"
+        )
+
Review comment:
Originally I had decided to go with that logic, but there are two cases that can happen:
1. `DagRun.run_id` is the same and `DagRun.execution_date` is the same
2. `DagRun.run_id` is different and `DagRun.execution_date` is the same

In both cases it should throw the execution_date error, because there are two unique constraints on the table (sketched below):
- `UniqueConstraint('dag_id', 'execution_date')`
- `UniqueConstraint('dag_id', 'run_id')`
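For reference, here is a minimal sketch of how two independent unique constraints like these behave. This is not the actual `DagRun` model from `airflow/models/dagrun.py`; the table name and column types are simplified assumptions:

```python
# Simplified stand-in for the DagRun table, only to illustrate the two
# constraints listed above (names/types are assumptions, not Airflow's model).
from sqlalchemy import Column, DateTime, Integer, String, UniqueConstraint
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class DagRunSketch(Base):
    __tablename__ = "dag_run_sketch"

    id = Column(Integer, primary_key=True)
    dag_id = Column(String(250), nullable=False)
    run_id = Column(String(250), nullable=False)
    execution_date = Column(DateTime, nullable=False)

    __table_args__ = (
        # Either constraint alone is enough to reject an insert, so a new
        # run_id combined with a duplicate execution_date still fails on the
        # ('dag_id', 'execution_date') constraint.
        UniqueConstraint("dag_id", "execution_date"),
        UniqueConstraint("dag_id", "run_id"),
    )
```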
If I added `DagRun.execution_date` to the same query on line 256, we would miss the AlreadyExists error when `DagRun.run_id` is different: `dagrun_instance` would be None and the DagRun would get created.
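To make case 2 concrete, here is a hedged sketch of the combined filter this paragraph argues against. The helper name is made up; `session`, `dag_id`, `post_body`, and `DagRun` are assumed to be the same objects used in the endpoint above:

```python
# Hypothetical version of the existence check with execution_date folded into
# the run_id query (the approach this comment argues against).
from airflow.models import DagRun


def find_existing_run(session, dag_id, post_body):
    """Return a DagRun only when BOTH run_id and execution_date already match."""
    return (
        session.query(DagRun)
        .filter(
            DagRun.dag_id == dag_id,
            DagRun.run_id == post_body["run_id"],
            DagRun.execution_date == post_body["execution_date"],
        )
        .first()
    )


# Case 2 above: run_id is new but execution_date repeats. find_existing_run()
# returns None, so no AlreadyExists is raised and the subsequent insert still
# violates UniqueConstraint('dag_id', 'execution_date') at the database level.
```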
Let me know if anything is wrong with my logic.
Thanks