dstandish commented on a change in pull request #13832:
URL: https://github.com/apache/airflow/pull/13832#discussion_r571151494



##########
File path: airflow/providers/amazon/aws/operators/batch.py
##########
@@ -177,29 +177,26 @@ def submit_job(self, context: Dict):  # pylint: disable=unused-argument
             self.job_id = response["jobId"]
 
             self.log.info("AWS Batch job (%s) started: %s", self.job_id, response)
-
         except Exception as e:
             self.log.error("AWS Batch job (%s) failed submission", self.job_id)
             raise AirflowException(e)
 
     def monitor_job(self, context: Dict):  # pylint: disable=unused-argument
         """
         Monitor an AWS Batch job
+        monitor_job can raise an exception or an AirflowTaskTimeout can be raised if execution_timeout
+        is given while creating the task. These exceptions should be handled in taskinstance.py
+        instead of here like it was previously done

Review comment:
      I am a bit confused, @potiuk.
   
   > an AirflowTaskTimeout can be raised if execution_timeout is given while creating the task
   
   This is true of every operator -- it's something that's part of BaseOperator, right?
   
   Maybe I'm missing something, but I don't see anywhere in this operator (or its inherited methods) where a task timeout is raised explicitly.
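   
   For context, the point above can be illustrated with a minimal sketch (not Airflow's actual implementation — the `TaskTimeout` exception and `run_with_timeout` helper below are hypothetical simplifications): a signal-based execution timeout wraps *any* callable from the outside, which is why no individual operator needs to raise the timeout itself.
   
   ```python
   import signal
   import time
   
   
   class TaskTimeout(Exception):
       """Stand-in for AirflowTaskTimeout (hypothetical name for illustration)."""
   
   
   def _alarm_handler(signum, frame):
       # Raised inside whatever the wrapped callable is doing at that moment.
       raise TaskTimeout("task exceeded execution_timeout")
   
   
   def run_with_timeout(fn, seconds):
       """Run fn, raising TaskTimeout if it runs longer than `seconds`.
   
       SIGALRM-based, so this only works in the main thread on POSIX systems.
       """
       old_handler = signal.signal(signal.SIGALRM, _alarm_handler)
       signal.setitimer(signal.ITIMER_REAL, seconds)
       try:
           return fn()
       finally:
           # Always cancel the timer and restore the previous handler.
           signal.setitimer(signal.ITIMER_REAL, 0)
           signal.signal(signal.SIGALRM, old_handler)
   
   
   # The wrapped callable (here, a slow "operator") knows nothing about the
   # timeout; the interruption arrives from outside, as with execution_timeout.
   try:
       run_with_timeout(lambda: time.sleep(5), 0.1)
   except TaskTimeout:
       print("timed out")
   ```
   
   The same shape applies to every operator, since the timeout wraps the task's execution from outside rather than being raised by operator code.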
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

