potiuk commented on pull request #13832:
URL: https://github.com/apache/airflow/pull/13832#issuecomment-765603917


   > > How would that solve it? From what I understand it would simply result in a different exception being thrown - but the flow would remain the same?
   > 
   > @potiuk If you check the [taskinstance.py](https://github.com/apache/airflow/blob/master/airflow/models/taskinstance.py#L1309) code, you can see that by removing the try/except block, the exception raised by the timeout class will now be handled by the TaskInstance instead of the batch_operator class.
   > 
   > ```
   > try:
   >     with timeout(task_copy.execution_timeout.total_seconds()):
   >         result = task_copy.execute(context=context)
   > except AirflowTaskTimeout:
   >     task_copy.on_kill()
   >     raise
   > ```
   
   Yeah, I know that. But I was wondering where the "AirflowTaskTimeout" is actually thrown - the AWS provider is, well... a bit convoluted.
   
   So I guess (purely because it is the only place in the AWS provider where that exception is thrown) it is "wait_for_task_execution" in AWSDataSyncHook that raises it.
   
   If so - can you please:
   
   a) add some comments describing what's going on - mention that the method might throw the timeout exception and that it should be passed back to the execute method (this way the next person coming here will not have to dig deeper and will not add the try/except back).
   
   b) can you please add a unit test covering this case to prevent regressions?
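   For (b), a minimal self-contained sketch of what such a regression test could look like. The Airflow classes are stubbed here so the snippet runs standalone; a real test would import AirflowTaskTimeout from airflow.exceptions and the real operator/hook from the AWS provider, and patch the hook with unittest.mock instead of a hand-rolled stub:

```python
# Sketch of the suggested regression test: the operator must NOT swallow
# AirflowTaskTimeout - it should propagate to the caller (TaskInstance),
# which is the component responsible for calling on_kill().
# All classes below are hypothetical stand-ins for the real Airflow ones.

class AirflowTaskTimeout(Exception):
    """Stub for airflow.exceptions.AirflowTaskTimeout."""


class StubDataSyncHook:
    def wait_for_task_execution(self, task_execution_arn):
        # Simulate the execution timeout firing while polling AWS DataSync.
        raise AirflowTaskTimeout("execution timed out")


class StubDataSyncOperator:
    """Operator without its own try/except around the hook call, so the
    timeout propagates up to the TaskInstance as intended."""

    def __init__(self):
        self.hook = StubDataSyncHook()

    def execute(self, context):
        # NOTE: wait_for_task_execution may raise AirflowTaskTimeout;
        # do NOT catch it here - TaskInstance handles it and calls on_kill().
        return self.hook.wait_for_task_execution("arn:fake")


def test_timeout_propagates_to_caller():
    op = StubDataSyncOperator()
    raised = False
    try:
        op.execute(context={})
    except AirflowTaskTimeout:
        raised = True
    assert raised, "AirflowTaskTimeout should propagate out of execute()"


test_timeout_propagates_to_caller()
```

   A test along these lines would fail immediately if someone re-adds a broad try/except around the hook call inside the operator.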
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

