ecerulm commented on pull request #14399:
URL: https://github.com/apache/airflow/pull/14399#issuecomment-784474383


   I've tested this by creating a new docker image with the changes: 
   ```
   FROM apache/airflow:2.0.1
   COPY airflow /home/airflow/.local/lib/python3.6/site-packages/airflow
   ``` 
   where the `airflow` directory contains the changed files: 
   ```
   find airflow -type f
   airflow/providers/amazon/aws/hooks/s3.py
   airflow/providers/amazon/aws/log/s3_task_handler.py
   ```
   
   After building with `docker build . -t xxxx.dkr.ecr.eu-north-1.amazonaws.com/airflow:2.0.1` and pushing to ECR with `docker push xxxx.dkr.ecr.eu-north-1.amazonaws.com/airflow:2.0.1`, I used the current Helm chart with:
   
   ``` 
   defaultAirflowTag: 2.0.1
   defaultAirflowRepository: xxxx.dkr.ecr.eu-north-1.amazonaws.com/airflow
   ```
   
   I can see that remote logging to S3 is now working, and the task pods no longer get stuck trying to upload their logs to S3.
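   For reference, remote logging itself still has to be enabled in the Airflow config; in my setup this was done via extra environment variables in the chart values. A minimal sketch (the bucket name, connection ID, and use of the chart's `env:` list are assumptions specific to my deployment, not part of this PR):
   
   ```
   env:
     - name: AIRFLOW__LOGGING__REMOTE_LOGGING
       value: "True"
     - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
       value: "s3://my-airflow-logs/logs"   # hypothetical bucket
     - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
       value: "aws_default"                 # assumed connection ID
   ```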


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

