amoghrajesh opened a new pull request, #63701:
URL: https://github.com/apache/airflow/pull/63701
<!-- SPDX-License-Identifier: Apache-2.0
https://www.apache.org/licenses/LICENSE-2.0 -->
closes: https://github.com/apache/airflow/issues/63438
### Problems
1. Migration field name mismatch
Migration `0101_3_2_0_ui_improvements_for_deadlines.py` looked for
`CALLBACK_KEY = "callback_def"`, but deadlines in 3.1.x data use the key
`callback`, e.g.:
`{"reference": {...}, "callback": {...}, "interval": 900}`
This mismatch caused the migration to fail.
2. Runtime format mismatch
3.1.x serialized deadline alerts as a list of dicts: `{"deadline":
[{"reference": {...}, "interval": 900, "callback": {...}}]}`
but 3.2+ expects a list of UUID strings: `{"deadline": ["01933eb3-7890-7123-8000-123456789abc"]}`
This broke Dag scanning in the Dag processor with errors like:
`AttributeError: 'dict' object has no attribute 'replace'`
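The two formats can be contrasted with a small sketch (illustrative data and helper name only, not the actual Airflow code):

```python
# 3.1.x stored the full deadline definition inline as a dict...
old_entry = {"reference": {"type": "DAGRUN_QUEUED_AT"}, "callback": {}, "interval": 900}
# ...while 3.2+ stores a UUID string referencing a Deadline row.
new_entry = "01933eb3-7890-7123-8000-123456789abc"

def is_pre_3_2_entry(entry):
    """Illustrative heuristic: old entries are dicts, new entries are UUID strings."""
    return isinstance(entry, dict)

assert is_pre_3_2_entry(old_entry)
assert not is_pre_3_2_entry(new_entry)
# String operations such as .replace() work on the new format but raise
# AttributeError on the old dict format, matching the reported error.
```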
### Fixes
To fix this, I made two changes:
1. Changed the field name in the migration from `callback_def` to `callback` to match the 3.1.x data.
2. Added a check in `_try_reuse_deadline_uuids()` that handles the old-format
data, acting as a migration shim.
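A minimal sketch of what such a shim might look like (hypothetical helper name and logic; the real implementation lives in `_try_reuse_deadline_uuids()` and tries to reuse existing deadline UUIDs rather than minting new ones):

```python
import uuid

def normalize_deadlines(serialized_deadlines):
    """Convert 3.1.x-style dict entries to the UUID-string form 3.2+ expects,
    passing already-migrated string entries through unchanged."""
    normalized = []
    for entry in serialized_deadlines:
        if isinstance(entry, dict):
            # Old format: mint a UUID here for illustration; the real shim
            # attempts to reuse the UUIDs of existing Deadline rows.
            normalized.append(str(uuid.uuid4()))
        else:
            normalized.append(entry)
    return normalized

mixed = [
    {"reference": {}, "interval": 900, "callback": {}},   # 3.1.x format
    "01933eb3-7890-7123-8000-123456789abc",               # 3.2+ format
]
result = normalize_deadlines(mixed)
assert all(isinstance(e, str) for e in result)
assert result[1] == "01933eb3-7890-7123-8000-123456789abc"
```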
### Testing
DAG:
```python
from datetime import timedelta

from airflow.providers.standard.operators.bash import BashOperator
from airflow.sdk import DAG
from airflow.sdk.definitions.deadline import AsyncCallback, DeadlineAlert, DeadlineReference


async def custom_async_callback(**kwargs):
    """Handle deadline violation with custom logic."""
    context = kwargs.get("context", {})
    print(f"Deadline exceeded for Dag {context.get('dag_run', {}).get('dag_id')}!")
    print(f"Context: {context}")
    print(f"Alert type: {kwargs.get('alert_type')}")


with DAG(
    dag_id="deadline_alert_test_dag",
    deadline=DeadlineAlert(
        reference=DeadlineReference.DAGRUN_QUEUED_AT,
        interval=timedelta(seconds=15),
        callback=AsyncCallback(
            custom_async_callback,
            kwargs={
                "text": "🚨 Dag {{ dag_run.dag_id }} missed deadline at {{ deadline.deadline_time }}. DagRun: {{ dag_run }}"
            },
        ),
    ),
):
    BashOperator(task_id="example_task", bash_command="sleep 30")
```
The DAG runs fine before the migration, and after it the Dag still shows up
without errors and executes successfully:
<img width="2558" height="1090" alt="image"
src="https://github.com/user-attachments/assets/ed0194d7-4bc4-439a-a336-162caf072eac"
/>
---
##### Was generative AI tooling used to co-author this PR?
- [x] No
---
* Read the **[Pull Request
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
for more information. Note: commit author/co-author name and email in commits
become permanently public when merged.
* For fundamental code changes, an Airflow Improvement Proposal
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
is needed.
* When adding dependency, check compliance with the [ASF 3rd Party License
Policy](https://www.apache.org/legal/resolved.html#category-x).
* For significant user-facing changes create newsfragment:
`{pr_number}.significant.rst`, in
[airflow-core/newsfragments](https://github.com/apache/airflow/tree/main/airflow-core/newsfragments).
You can add this file in a follow-up commit after the PR is created so you
know the PR number.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]