FFCMSouza commented on issue #38935:
URL: https://github.com/apache/airflow/issues/38935#issuecomment-2054223724

   Anyway, I managed to solve my problem by overriding the `on_kill` method in my Spark operator.
   The only downside is that `on_kill` doesn't have access to the task context. To work around that, I set instance attributes in the `execute` method with the context information `on_kill` needs.
   
   ```
   def on_kill(self):
       logging.info('starting on_kill')
       cancel_step(self.project, self.dag_name, self.run_id, self.task_id)

   def execute(self, context):
       self.log.info("starting SparkToDataLake.execute")

       # keep the values on_kill will need as instance attributes
       self.dag_name = context.get('task_instance').dag_id
       self.run_id = context.get('run_id')
       self.task_id = context.get('task_instance').task_id + str(context.get('task_instance').map_index)
   ```
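   In case it helps anyone trying the same pattern, here is a minimal, self-contained sketch of how the pieces could fit together in a full operator. `cancel_step` and the `project` constructor argument are placeholders for whatever cancellation helper and configuration your setup actually uses, so treat them as assumptions, not part of any Airflow API.

   ```
   import logging

   from airflow.models.baseoperator import BaseOperator


   def cancel_step(project, dag_name, run_id, task_id):
       """Placeholder: call your cluster's API here to cancel the running job."""
       logging.info("cancelling step for %s/%s/%s/%s", project, dag_name, run_id, task_id)


   class SparkToDataLake(BaseOperator):
       def __init__(self, project, **kwargs):
           super().__init__(**kwargs)
           self.project = project

       def execute(self, context):
           self.log.info("starting SparkToDataLake.execute")

           # on_kill has no access to the context, so keep what it needs on self
           self.dag_name = context.get('task_instance').dag_id
           self.run_id = context.get('run_id')
           self.task_id = context.get('task_instance').task_id + str(context.get('task_instance').map_index)

           # ... submit the Spark job and wait for it here ...

       def on_kill(self):
           logging.info('starting on_kill')
           cancel_step(self.project, self.dag_name, self.run_id, self.task_id)
   ```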
   
![image](https://github.com/apache/airflow/assets/68713515/132a1c57-f738-44c2-b4ef-053c3d8787e2)
   
   PS: The `on_kill` method is also called when the task raises an `AirflowTaskTimeout`.
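
   So if you want the timeout path to go through the same cleanup, setting `execution_timeout` on the task is enough; when the task exceeds it, the timeout is raised and `on_kill` runs before the task is failed. A small illustration, with the operator and its arguments taken from the placeholder sketch above:

   ```
   from datetime import timedelta

   spark_task = SparkToDataLake(
       task_id='spark_to_datalake',
       project='my-project',                   # placeholder argument from the sketch above
       execution_timeout=timedelta(hours=2),   # timeout after 2h also triggers on_kill
   )
   ```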

