sideef5ect opened a new issue, #41522: URL: https://github.com/apache/airflow/issues/41522
### Description

Currently, Apache Airflow enforces a unique constraint on the combination of `dag_id` and `execution_date` for identifying DAG runs. This constraint ensures that each DAG run is uniquely identified by its DAG ID and execution date. However, there are scenarios where this constraint is limiting: it makes it impossible to schedule multiple DAG runs with the same `execution_date`. Additionally, recent versions of Airflow have introduced another unique key, `(dag_id, run_id)`, which should be sufficient on its own to uniquely identify DAG runs.

### Use case/motivation

Use case example: cannot schedule a DAG run at the same execution date.

- Scenario: A data engineering team needs to reprocess a daily ETL pipeline due to an upstream data issue. The pipeline is scheduled to run at midnight every day.
- Problem: Because of the unique constraint on `(dag_id, execution_date)`, the team cannot schedule another run of the DAG for the same execution date (midnight of the same day) without changing the execution date or creating a new DAG.
- Impact: This limitation forces the team to implement workarounds, such as modifying the execution date or duplicating the DAG, which complicates workflow management and increases the risk of errors.
- Solution: Removing the unique constraint on `(dag_id, execution_date)` would allow the team to schedule multiple runs of the DAG for the same execution date, simplifying reprocessing and improving workflow efficiency.

### Related issues

https://github.com/apache/airflow/issues/15150

### Are you willing to submit a PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
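To make the proposal concrete, here is a minimal, hypothetical sketch (this is not Airflow's actual `DagRun` model; table and constraint names are invented for illustration) of a run table that keeps only the `(dag_id, run_id)` unique key. With the `(dag_id, execution_date)` constraint dropped, two runs of the same DAG can share an execution date as long as their run IDs differ:

```python
# Illustrative sketch only -- not Airflow's real schema.
from datetime import datetime

from sqlalchemy import (
    Column, DateTime, Integer, String, UniqueConstraint, create_engine,
)
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class DagRun(Base):
    __tablename__ = "dag_run"

    id = Column(Integer, primary_key=True)
    dag_id = Column(String, nullable=False)
    run_id = Column(String, nullable=False)
    execution_date = Column(DateTime, nullable=False)

    # Only (dag_id, run_id) is unique; there is deliberately no
    # unique constraint on (dag_id, execution_date).
    __table_args__ = (
        UniqueConstraint("dag_id", "run_id", name="dag_run_dag_id_run_id_key"),
    )


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    midnight = datetime(2024, 1, 1)
    # Two runs of the same DAG, same execution_date, different run_ids:
    # both insert cleanly because only (dag_id, run_id) must be unique.
    session.add(DagRun(dag_id="daily_etl", run_id="scheduled__2024-01-01",
                       execution_date=midnight))
    session.add(DagRun(dag_id="daily_etl", run_id="manual__reprocess_1",
                       execution_date=midnight))
    session.commit()
    print(session.query(DagRun).count())  # -> 2
```

Under the current schema the second insert would violate the `(dag_id, execution_date)` constraint; under the proposed schema it succeeds, which is exactly the reprocessing scenario described above.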
