jackemuk commented on issue #33658:
URL: https://github.com/apache/airflow/issues/33658#issuecomment-1693249771
The DAG code is pretty simple. There are two tasks: the first just pulls the DAG run config and passes it to the `external_python` task, because the config (`params`) is not included in the context when using the `task.external_python` decorator. I've renamed the DAG tasks for security purposes.
We are able to delete new DAG runs, but the first 4 DAG runs cannot be deleted. Would deleting the DAG and then reloading it resolve the issue?
I'm also noticing that when we redeploy the DAG, without changes, the first task is losing its run history (see the attached pic below).
```
from airflow.decorators import dag, task
from airflow.operators.python import get_current_context
import pendulum
import datetime as dt

from utils.email import generic_task_failure_alert

python_venv = "/path/to/virtualenv/bin/python"
worker_queue = "default"


@dag(
    schedule=None,
    start_date=pendulum.datetime(2022, 3, 24, tz="US/Eastern"),
    catchup=False,
    default_args={
        "owner": "jackemuk",
        "on_failure_callback": generic_task_failure_alert,
        "queue": worker_queue,
        "expect_airflow": False,
    },
)
def sync_test_data():
    @task()
    def get_dag_config():
        # params is available here because this is a regular @task
        context = get_current_context()
        return context.get("params")

    @task.external_python(doc=None, python=python_venv,
                          retry_delay=dt.timedelta(minutes=5))
    def test_data(params):
        from logic.sync_data import SyncData

        sync = SyncData()
        sync.sync_test_data(params=params)

    test_data(get_dag_config())


sync_test_data_dag = sync_test_data()
```
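One side note on this pattern (my observation, not part of the original report): the value returned by `get_dag_config` reaches `test_data` via XCom, and Airflow's default XCom backend serializes values as JSON, so the config dict has to survive a JSON round trip. A minimal sanity-check sketch, using a hypothetical `params` dict in place of the real DAG run config:

```python
import json

# Hypothetical stand-in for the real DAG run config returned by get_dag_config().
params = {"source_table": "test_data", "batch_size": 100}

# Round-trip through JSON, mirroring what the default XCom backend does when
# it stores a task's return value.
roundtripped = json.loads(json.dumps(params))
assert roundtripped == params
```

If the dict holds non-JSON-serializable values (e.g. datetime objects), the XCom push will fail before the `external_python` task ever runs.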
