ferruzzi commented on code in PR #61702:
URL: https://github.com/apache/airflow/pull/61702#discussion_r2789665919


##########
airflow-core/src/airflow/models/serialized_dag.py:
##########
@@ -503,9 +565,20 @@ def write_dag(
                 and existing_serialized_dag.data
                and (existing_deadline_uuids := existing_serialized_dag.data.get("dag", {}).get("deadline"))
             ):
-                dag.data["dag"]["deadline"] = existing_deadline_uuids
-                deadline_uuid_mapping = {}
+                deadline_uuid_mapping = cls._try_reuse_deadline_uuids(
+                    existing_deadline_uuids,
+                    dag.data["dag"]["deadline"],
+                    session,
+                )
+
+                if deadline_uuid_mapping is not None:
+                    # All deadlines matched, reuse the UUIDs to preserve hash.
+                    dag.data["dag"]["deadline"] = list(deadline_uuid_mapping.keys())

Review Comment:
   That's a really good catch. I was thinking I wanted to preserve the order from the helper, but it really doesn't matter, does it? I think you are right; `existing_deadline_uuids` would be cleaner and easier to understand.
   
   Let me think on it over lunch and see if I can spot any issues that change would cause, but I'll likely implement it.
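
   To illustrate the equivalence being discussed: a minimal, self-contained sketch (the helper below is hypothetical and much simpler than the real `_try_reuse_deadline_uuids`, which also consults the session) showing that when every new deadline matches an existing one, the mapping's keys are exactly the existing UUIDs in the same order, so assigning `existing_deadline_uuids` directly yields the same serialized content:

   ```python
   def try_reuse_deadline_uuids(existing_uuids, new_deadlines):
       """Hypothetical, simplified stand-in for the PR's helper.

       Return a mapping of existing UUID -> new deadline when every new
       deadline can be paired with an existing UUID, else None.
       """
       if len(existing_uuids) != len(new_deadlines):
           return None
       return dict(zip(existing_uuids, new_deadlines))


   existing = ["uuid-a", "uuid-b"]
   new_deadlines = ["deadline-1", "deadline-2"]
   mapping = try_reuse_deadline_uuids(existing, new_deadlines)

   if mapping is not None:
       # Python dicts preserve insertion order (3.7+), so the keys come
       # back in the same order they went in: list(mapping.keys()) and
       # `existing` hold identical UUIDs, making the two assignments
       # in the review interchangeable.
       assert list(mapping.keys()) == existing
   ```

   Since both spellings produce the same list, the choice comes down to readability, which is the point of the suggestion.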



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
