Taragolis commented on code in PR #29522:
URL: https://github.com/apache/airflow/pull/29522#discussion_r1141026847


##########
airflow/providers/amazon/aws/hooks/batch_client.py:
##########
@@ -419,8 +419,46 @@ def get_job_awslogs_info(self, job_id: str) -> dict[str, str] | None:
 
         :param job_id: AWS Batch Job ID
         """
-        job_container_desc = self.get_job_description(job_id=job_id).get("container", {})
-        log_configuration = job_container_desc.get("logConfiguration", {})
+        job_desc = self.get_job_description(job_id=job_id)

Review Comment:
   > I'd imagine that multinode batch jobs would not be multi-region ?
   
   AFAIK, Batch resources are region-specific for any type of job:
   - Compute Environment (ECS or EKS clusters)
   - Job Definition
   - Job Queues
   
   You could configure logging to another region (CloudWatch) or to one of the
   other supported log drivers. But this is configured when the Batch Job
   Definition is created (registered) and can't be changed at job submission,
   so all of a job's logs should end up in one destination.
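   
   For example, a minimal sketch (hypothetical names; standard boto3 calls) of
   how the log destination is pinned at registration time rather than at
   submit time:
   
   ```python
   import boto3
   
   batch = boto3.client("batch", region_name="us-east-1")
   
   # The awslogs destination is fixed here, when the job definition is
   # registered; submit_job cannot override it later.
   batch.register_job_definition(
       jobDefinitionName="example-job-def",  # hypothetical name
       type="container",
       containerProperties={
           "image": "busybox",
           "command": ["echo", "hello"],
           "resourceRequirements": [
               {"type": "VCPU", "value": "1"},
               {"type": "MEMORY", "value": "128"},
           ],
           "logConfiguration": {
               "logDriver": "awslogs",
               "options": {
                   # Logs may ship to a different region than the job runs in,
                   # but every node of the job writes to this one destination.
                   "awslogs-region": "eu-west-1",
                   "awslogs-group": "/aws/batch/example",  # hypothetical group
               },
           },
       },
   )
   ```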
   
   > The job runs on many nodes, but the logs all end up in the same log stream.
   
   Nope, each node has its own logs in a unique log stream:
   
   
https://user-images.githubusercontent.com/3998685/226111922-79641781-069c-4e84-826e-36a7c53c83ef.mp4
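   
   A rough sketch (not the PR's implementation) of listing those per-node
   streams, using the documented `<job_id>#<node_index>` child-job IDs of a
   multinode job:
   
   ```python
   import boto3
   
   batch = boto3.client("batch")
   
   def get_node_log_streams(job_id: str, num_nodes: int) -> list[str]:
       """Collect one CloudWatch log stream per node of a multinode Batch job.
   
       Each child node job (``<job_id>#<node_index>``) carries its own
       ``logStreamName`` in its container details.
       """
       # describe_jobs accepts up to 100 job IDs per call; larger jobs
       # would need to be chunked.
       child_ids = [f"{job_id}#{index}" for index in range(num_nodes)]
       streams = []
       for job in batch.describe_jobs(jobs=child_ids)["jobs"]:
           stream = (job.get("container") or {}).get("logStreamName")
           if stream:
               streams.append(stream)
       return streams
   ```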
   
   
   


