amoghrajesh opened a new pull request, #61118:
URL: https://github.com/apache/airflow/pull/61118

   <!--
   Thank you for contributing!
   
   Please provide above a brief description of the changes made in this pull 
request.
   Write a good git commit message following this guide: 
http://chris.beams.io/posts/git-commit/
   
   Please make sure that your code changes are covered with tests.
   And in case of new features or big changes remember to adjust the 
documentation.
   
   Feel free to ping (in general) for the review if you do not see reaction for 
a few days
   (72 Hours is the minimum reaction time you can expect from volunteers) - we 
sometimes miss notifications.
   
   In case of an existing issue, reference it using one of the following:
   
   * closes: #ISSUE
   * related: #ISSUE
   -->
   
   ---
   
   ##### Was generative AI tooling used to co-author this PR?
   
   <!--
   If generative AI tooling has been used in the process of authoring this PR, 
please
   change below checkbox to `[X]` followed by the name of the tool, uncomment 
the "Generated-by".
   -->
   
   - [x] Yes (please specify the tool below)
   Used Cursor IDE
   
   <!--
   Generated-by: [Tool Name] following [the 
guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#gen-ai-assisted-contributions)
   -->
   
   ### Why This Change?
   
   This PR assists the client-server separation by moving the worker-side deadline 
functionality to `task-sdk` while keeping the DB-dependent evaluation logic in 
`airflow-core`. This follows the established pattern from the assets migration: 
#58993
   
   
   ### What stays where?
   
   In simpler terms, this is the split the PR establishes.
   
   #### SDK
   - `DeadlineAlert`: The user-facing class for defining deadline alerts 
(no serialization methods from now on)
   - `DeadlineReference`: The same factory class for creating deadline 
references
   - `ReferenceModels.*`: Original reference implementations for backward 
compatibility
   
   The principle here is to keep the lightweight DAG-authoring interface, with 
no database dependencies, in the SDK.
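   
   For illustration, a minimal authoring sketch. The `airflow.sdk.definitions.deadline` module path and the `DAGRUN_LOGICAL_DATE` attribute name are assumptions based on the description in this PR; the `AsyncCallback` path comes from the JSON example below.
   
   ```python
   # Hypothetical DAG-author usage; module path and attribute name are assumed.
   from airflow.sdk.definitions.callback import AsyncCallback
   from airflow.sdk.definitions.deadline import DeadlineAlert, DeadlineReference

   alert = DeadlineAlert(
       # Factory attribute assumed; resolves to a ReferenceModels class
       # (e.g. DagRunLogicalDateDeadline).
       reference=DeadlineReference.DAGRUN_LOGICAL_DATE,
       interval=3600,  # seconds relative to the reference time
       callback=AsyncCallback(path="my_pkg.alerts.notify", kwargs={"channel": "#oncall"}),
   )
   ```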
   
   #### Core / Serialization module
   - `SerializedDeadlineAlert`: The internal representation used in core after 
a `DeadlineAlert` is deserialized
   - `SerializedReferenceModels.*`: Reference implementations with database 
access
   - `encode_deadline_alert()` / `decode_deadline_alert()`: Centralized 
functions for serializing/deserializing deadline alerts
   
   
   The principle here is to keep the serialization, deserialization, and 
deadline evaluation with database access in core.
   
   ### Serialization Changes
   
   #### Structure
   Serialization format remains **unchanged** - no breaking changes to stored 
DAGs:
   
   ```json
   {
     "__type": "deadline_alert",
     "__var": {
       "reference": {"reference_type": "DagRunLogicalDateDeadline"},
       "interval": 3600.0,
       "callback": {
         "__classname__": "airflow.sdk.definitions.callback.AsyncCallback",
         "__version__": 0,
         "__data__": {"path": "...", "kwargs": {...}}
       }
     }
   }
   ```
   
   The flow of control is likewise unchanged (a round-trip sketch follows the list):
   1. Encode (DAG Processor): `DeadlineAlert` → dict via 
`encode_deadline_alert()` using `airflow.sdk.serde`
   2. The dict is stored as JSON in the database
   3. Decode (Scheduler): dict → `SerializedDeadlineAlert` via 
`decode_deadline_alert()`
   4. Evaluate: `SerializedReferenceModels` uses a database session to calculate 
deadlines
   
   One thing of note is the callback serialization: I chose to continue using 
serde for this purpose because `BaseSerialization` cannot handle callbacks. Using 
serde made sense since this part of serialization runs in the DAG processor, which 
ultimately is not a _core component_ and can use the task SDK. So, the flow (see the sketch after this list):
   
   - Uses `airflow.sdk.serde.serialize()` / `deserialize()` for proper callback 
handling
   - Runs in DAG Processor context where SDK is available
   - Callbacks fully serialize with path and kwargs (no string representations)
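   
   As a sketch of that callback round trip: the output shape is taken from the JSON example above, the `path`/`kwargs` values are made up, and the exact `serde` behavior is assumed.
   
   ```python
   # Illustrative serde round trip for a callback; values are hypothetical.
   from airflow.sdk.definitions.callback import AsyncCallback
   from airflow.sdk.serde import deserialize, serialize

   cb = AsyncCallback(path="my_pkg.alerts.notify", kwargs={"channel": "#oncall"})
   data = serialize(cb)  # -> {"__classname__": "...AsyncCallback", "__version__": 0, "__data__": {...}}
   restored = deserialize(data)
   assert isinstance(restored, AsyncCallback)
   ```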
   
   ### Backward Compatibility
   
   - Serialization format identical to main branch
   - Reference class names unchanged (e.g., `DagRunLogicalDateDeadline` not 
`SerializedDagRunLogicalDateDeadline`)
   - Existing serialized DAGs deserialize correctly
   - Internal API only - no user-facing changes, so hopefully nothing to worry 
about
   
   
   ---
   
   * Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information. Note: commit author/co-author name and email in commits 
become permanently public when merged.
   * For fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   * When adding dependency, check compliance with the [ASF 3rd Party License 
Policy](https://www.apache.org/legal/resolved.html#category-x).
   * For significant user-facing changes create newsfragment: 
`{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in 
[airflow-core/newsfragments](https://github.com/apache/airflow/tree/main/airflow-core/newsfragments).
   

