luongthanhlam opened a new issue, #30208:
URL: https://github.com/apache/airflow/issues/30208
### Apache Airflow version
2.5.2
### What happened
When attempting to trigger the same DAG twice in a unit test, it fails:
```
def do_execute(self, cursor, statement, parameters, context=None):
>       cursor.execute(statement, parameters)
E       sqlalchemy.exc.IntegrityError: (sqlite3.IntegrityError) UNIQUE constraint failed: dag_run.dag_id, dag_run.run_id
E       [SQL: INSERT INTO dag_run (dag_id, queued_at, execution_date, start_date, end_date, state, run_id, creating_job_id, external_trigger, run_type, conf, data_interval_start, data_interval_end, last_scheduling_decision, dag_hash, log_template_id, updated_at) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, (SELECT max(log_template.id) AS max_1
E       FROM log_template), ?)]
E       [parameters: ('my_custom_operator_dag', None, '2021-09-13 00:00:00.000000', '2021-09-14 00:00:00.000000', None, <DagRunState.RUNNING: 'running'>, 'manual__2021-09-13T00:00:00+00:00', None, 0, <DagRunType.MANUAL: 'manual'>, <memory at 0x116f5c400>, '2021-09-13 00:00:00.000000', '2021-09-14 00:00:00.000000', None, None, '2023-03-21 09:57:43.268050')]
E       (Background on this error at: https://sqlalche.me/e/14/gkpj)

.venv/lib/python3.9/site-packages/sqlalchemy/engine/default.py:736: IntegrityError
```
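For context, the collision appears to come from how the manual run_id is generated: it is derived from the logical date, so two test sessions with the same `DATA_INTERVAL_START` produce the same `(dag_id, run_id)` pair and trip the unique constraint. A minimal sketch illustrating this, assuming the `DagRun.generate_run_id` helper from `airflow.models.dagrun`:

```
import pendulum
from airflow.models.dagrun import DagRun
from airflow.utils.types import DagRunType

# The manual run_id embeds the logical date, so a rerun with the same
# DATA_INTERVAL_START collides on the (dag_id, run_id) unique constraint.
run_id = DagRun.generate_run_id(DagRunType.MANUAL, pendulum.datetime(2021, 9, 13, tz="UTC"))
print(run_id)  # manual__2021-09-13T00:00:00+00:00 -- same run_id as in the traceback
```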
### What you think should happen instead
_No response_
### How to reproduce
I followed the sample code in the docs:
https://airflow.apache.org/docs/apache-airflow/stable/best-practices.html
```
import datetime

import pendulum
import pytest

from airflow import DAG
from airflow.utils.state import DagRunState, TaskInstanceState
from airflow.utils.types import DagRunType

# MyCustomOperator is the custom operator under test; import it from your project.

DATA_INTERVAL_START = pendulum.datetime(2021, 9, 13, tz="UTC")
DATA_INTERVAL_END = DATA_INTERVAL_START + datetime.timedelta(days=1)

TEST_DAG_ID = "my_custom_operator_dag"
TEST_TASK_ID = "my_custom_operator_task"


@pytest.fixture()
def dag():
    with DAG(
        dag_id=TEST_DAG_ID,
        schedule="@daily",
        start_date=DATA_INTERVAL_START,
    ) as dag:
        MyCustomOperator(
            task_id=TEST_TASK_ID,
            prefix="s3://bucket/some/prefix",
        )
    return dag


def test_my_custom_operator_execute_no_trigger(dag):
    dagrun = dag.create_dagrun(
        state=DagRunState.RUNNING,
        execution_date=DATA_INTERVAL_START,
        data_interval=(DATA_INTERVAL_START, DATA_INTERVAL_END),
        start_date=DATA_INTERVAL_END,
        run_type=DagRunType.MANUAL,
    )
    ti = dagrun.get_task_instance(task_id=TEST_TASK_ID)
    ti.task = dag.get_task(task_id=TEST_TASK_ID)
    ti.run(ignore_ti_state=True)
    assert ti.state == TaskInstanceState.SUCCESS
    # Assert something related to task results.
```
Run this unit test twice and you'll get the error above.
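As a workaround, the leftover rows can be deleted between runs. A minimal sketch of an autouse teardown fixture, assuming `create_session` from `airflow.utils.session` and the `DagRun`/`TaskInstance` models (the fixture name `clean_db` is hypothetical):

```
import pytest

from airflow.models import DagRun, TaskInstance
from airflow.utils.session import create_session

TEST_DAG_ID = "my_custom_operator_dag"


@pytest.fixture(autouse=True)
def clean_db():
    """Remove DagRun/TaskInstance rows for the test DAG so a rerun
    does not hit the (dag_id, run_id) unique constraint."""
    yield  # run the test first, then clean up
    with create_session() as session:
        # Delete task instances before their parent dag runs.
        session.query(TaskInstance).filter(TaskInstance.dag_id == TEST_DAG_ID).delete()
        session.query(DagRun).filter(DagRun.dag_id == TEST_DAG_ID).delete()
```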
### Operating System
MacOS 12.4
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)