Asquator commented on code in PR #61274:
URL: https://github.com/apache/airflow/pull/61274#discussion_r2771502039


##########
airflow-core/src/airflow/models/dagrun.py:
##########
@@ -1410,24 +1416,44 @@ def notify_dagrun_state_changed(self, msg: str):
         # or LocalTaskJob, so we don't want to "falsely advertise" we notify about that
 
     @provide_session
-    def get_last_ti(self, dag: SerializedDAG, session: Session = NEW_SESSION) -> TI | None:
-        """Get Last TI from the dagrun to build and pass Execution context object from server to then run callbacks."""
+    def get_first_ti_causing_failure(self, dag: SerializedDAG, session: Session = NEW_SESSION) -> TI | None:
+        """
+        Get the first task instance that would cause a leaf task to fail the run.
+        """
+
         tis = self.get_task_instances(session=session)
-        # tis from a dagrun may not be a part of dag.partial_subset,
-        # since dag.partial_subset is a subset of the dag.
-        # This ensures that we will only use the accessible TI
-        # context for the callback.
+
+        failed_leaf_tis = [
+            ti for ti in self._tis_for_dagrun_state(dag=dag, tis=tis)
+            if ti.state in State.failed_states
+        ]
+
+        if not failed_leaf_tis:
+            return None
+

Review Comment:
   It's a shortcut to avoid the logic below.
   
   Further down I use the `min` function, which doesn't operate on empty collections; the `default` argument catches that case. Still, there's no reason to scan the DAG at all if for some reason there are no failed tasks. I think I should add a warning there, because this function shouldn't be called on DAG runs that haven't failed.
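   
   For illustration, here is a minimal standalone sketch of that pattern (the early return, the proposed warning, and `min(..., default=...)` as a defensive fallback). The `_FakeTI` stand-in, the function name, and the ordering key are hypothetical placeholders, not the actual PR code:
   
   ```python
   from __future__ import annotations
   
   import logging
   from dataclasses import dataclass
   from datetime import datetime
   
   log = logging.getLogger(__name__)
   
   
   @dataclass
   class _FakeTI:
       """Stand-in for a TaskInstance; only the fields this sketch needs."""
   
       task_id: str
       end_date: datetime | None = None
   
   
   def first_failed_ti(failed_leaf_tis: list[_FakeTI]) -> _FakeTI | None:
       """Hypothetical sketch of the shortcut discussed above; not the PR code."""
       # Shortcut: nothing failed, nothing to scan. This function isn't meant to be
       # called in that situation, hence the warning.
       if not failed_leaf_tis:
           log.warning("first_failed_ti called with no failed task instances")
           return None
   
       # `min` raises ValueError on an empty iterable unless `default` is given.
       # The early return above already guarantees non-emptiness, so `default=None`
       # is only a belt-and-braces fallback here.
       return min(
           failed_leaf_tis,
           key=lambda ti: ti.end_date or datetime.max,  # hypothetical ordering: earliest finish first
           default=None,
       )
   ```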



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
