tanvn opened a new issue, #47902:
URL: https://github.com/apache/airflow/issues/47902

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon          9.2.0
   apache-airflow-providers-apache-hdfs     4.7.0
   apache-airflow-providers-apache-hive     9.0.0
   apache-airflow-providers-apache-spark    5.0.0
   apache-airflow-providers-celery          3.10.0
   apache-airflow-providers-cncf-kubernetes 10.1.0
   apache-airflow-providers-common-compat   1.3.0
   apache-airflow-providers-common-io       1.5.0
   apache-airflow-providers-common-sql      1.21.0
   apache-airflow-providers-docker          4.0.0
   apache-airflow-providers-elasticsearch   6.0.0
   apache-airflow-providers-fab             1.5.2
   apache-airflow-providers-ftp             3.12.0
   apache-airflow-providers-google          12.0.0
   apache-airflow-providers-grpc            3.7.0
   apache-airflow-providers-http            5.0.0
   apache-airflow-providers-imap            3.8.0
   apache-airflow-providers-mysql           6.0.0
   apache-airflow-providers-postgres        6.0.0
   apache-airflow-providers-redis           4.0.0
   apache-airflow-providers-sendgrid        4.0.0
   apache-airflow-providers-sftp            5.0.0
   apache-airflow-providers-slack           9.0.0
   apache-airflow-providers-smtp            1.9.0
   apache-airflow-providers-sqlite          4.0.0
   apache-airflow-providers-ssh             4.0.0
   
   ### Apache Airflow version
   
   2.10.5
   
   ### Operating System
   
   Ubuntu 20.04.6 LTS (Focal Fossa)
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   helm with k8s
   
   ### What happened
   
   When a task finishes, its logs are uploaded to AWS S3.
   I had been using 2.8.4 until now with no problems.
   After upgrading to 2.10.5, we are seeing the error below:
   
   ```
   [2025-03-18, 10:02:36 UTC] {s3_task_handler.py:190} ERROR - Could not write logs to s3://my-bucket/airflow-logs/dag_id=my_dag/run_id=scheduled__2025-03-10T03:00:00+00:00/task_id=task_1/attempt=5.log
   Traceback (most recent call last):
     File "/usr/local/lib/python3.10/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 179, in s3_write
       self.hook.load_string(
     File "/usr/local/lib/python3.10/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 153, in wrapper
       return func(*bound_args.args, **bound_args.kwargs)
     File "/usr/local/lib/python3.10/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 126, in wrapper
       return func(*bound_args.args, **bound_args.kwargs)
     File "/usr/local/lib/python3.10/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 1198, in load_string
       self._upload_file_obj(f, key, bucket_name, replace, encrypt, acl_policy)
     File "/usr/local/lib/python3.10/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 1281, in _upload_file_obj
       client.upload_fileobj(
     File "/usr/local/lib/python3.10/site-packages/boto3/s3/inject.py", line 642, in upload_fileobj
       return future.result()
     File "/usr/local/lib/python3.10/site-packages/s3transfer/futures.py", line 103, in result
       return self._coordinator.result()
     File "/usr/local/lib/python3.10/site-packages/s3transfer/futures.py", line 264, in result
       raise self._exception
     File "/usr/local/lib/python3.10/site-packages/s3transfer/tasks.py", line 135, in __call__
       return self._execute_main(kwargs)
     File "/usr/local/lib/python3.10/site-packages/s3transfer/tasks.py", line 158, in _execute_main
       return_value = self._main(**kwargs)
     File "/usr/local/lib/python3.10/site-packages/s3transfer/upload.py", line 796, in _main
       client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
     File "/usr/local/lib/python3.10/site-packages/botocore/client.py", line 569, in _api_call
       return self._make_api_call(operation_name, kwargs)
     File "/usr/local/lib/python3.10/site-packages/botocore/client.py", line 1023, in _make_api_call
       raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (MissingContentLength) when calling the PutObject operation: The internal error code is -2011
   ```
   
   ### What you think should happen instead
   
   I think we are hitting
   https://github.com/boto/boto3/issues/4398
   
   Following the workaround described there and downgrading boto3 to `1.35.99`
   made the error go away.
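   
   In case it helps others, the downgrade can be applied by pinning boto3 in whatever requirements/constraints file the image is built from. This is only a sketch of the pin; the `botocore` line is my assumption, added to keep the pair consistent:
   
   ```
   # Workaround for boto/boto3#4398 (version taken from that issue)
   boto3==1.35.99
   # assumption: matching botocore pin to keep the pair consistent
   botocore==1.35.99
   ```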
   
   ### How to reproduce
   
   Just enable remote logging in the helm chart values:
   ```
     logging:
       logging_level: INFO
       remote_logging: "true"
       remote_base_log_folder: "s3://my-bucket/airflow-logs"
       remote_log_conn_id: "airflow_log_conn"
       encrypt_s3_logs: false
   ```
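   
   For anyone reproducing outside the helm chart, these values map to Airflow environment variables (a sketch; the bucket and connection id are placeholders from above):
   
   ```
   AIRFLOW__LOGGING__REMOTE_LOGGING=True
   AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://my-bucket/airflow-logs
   AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=airflow_log_conn
   AIRFLOW__LOGGING__ENCRYPT_S3_LOGS=False
   ```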
   
   ### Anything else
   
   This happens 100% to all my tasks on Airflow 2.10.5
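   
   Since every task is affected, a quick way to check whether an image is in the suspect range is to compare the installed boto3 version against `1.36.0` (an assumption based on the linked issue, which reports the behaviour changing with the 1.36 series):
   
   ```python
   def is_affected(boto3_version: str) -> bool:
       """Return True if the version is in the 1.36+ range where the
       MissingContentLength error reportedly appears (per boto/boto3#4398)."""
       major, minor, *_ = (int(part) for part in boto3_version.split("."))
       return (major, minor) >= (1, 36)
   
   # Compare against boto3.__version__ inside the Airflow image, e.g.:
   print(is_affected("1.35.99"))  # the pinned workaround version -> False
   print(is_affected("1.36.2"))   # -> True
   ```
   
   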
   
   ### Are you willing to submit PR?
   
   - [x] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

