pankajkoti commented on code in PR #31201:
URL: https://github.com/apache/airflow/pull/31201#discussion_r1191924832


##########
airflow/providers/apache/livy/operators/livy.py:
##########
@@ -207,4 +207,5 @@ def execute_complete(self, context: Context, event: dict[str, Any]) -> Any:
             self.task_id,
             event["response"],
         )
+        context["ti"].xcom_push(key="app_id", 
value=self.get_hook().get_batch(event["batch_id"])["appId"])

Review Comment:
   Yes, currently we push the Spark appId not for all terminal states but only when the batch job completes successfully.
   
   If we implemented your suggestion, @pankajastro, the appId would be pushed to XCom early, irrespective of the batch job's final status. Whether that is worthwhile depends on whether downstream tasks need this XCom value even when the task has failed. I would avoid pushing an additional XCom record into the metadata database if it is not really needed.
   
   @bdsoha, from your experience, do you think pushing the appId to XCom would benefit downstream tasks, or might be needed by them, even if the task has failed?


