dacort commented on issue #14089:
URL: https://github.com/apache/airflow/issues/14089#issuecomment-776262067


This looks very similar to https://github.com/apache/airflow/issues/12969, but that one should have been fixed in 2.0. I'm still hitting this on both 2.0 and 2.0.1rc2.
   
If I update the S3 hook's `upload_fileobj` call to not use threads, everything works fine:
   
https://github.com/apache/airflow/blob/fdd9b6f65b608c516b8a062b058972d9a45ec9e3/airflow/providers/amazon/aws/hooks/s3.py#L644
   
```python
from boto3.s3.transfer import TransferConfig  # new import at the top of s3.py

client.upload_fileobj(file_obj, bucket_name, key, ExtraArgs=extra_args,
                      Config=TransferConfig(use_threads=False))
```
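
If you want to poke at this outside Airflow, here's a minimal standalone sketch of the same call through plain boto3 (my own snippet, not provider code; the bucket, key, and credential setup are placeholders):

```python
# Standalone sketch, not provider code: the same upload call via plain boto3.
# Assumes AWS credentials in the environment; bucket/key are placeholders.
import io

import boto3
from boto3.s3.transfer import TransferConfig

client = boto3.client("s3")
client.upload_fileobj(
    io.BytesIO(b"test payload"),
    "my-test-bucket",    # placeholder bucket
    "airflow/test.log",  # placeholder key
    Config=TransferConfig(use_threads=False),  # the workaround: no threads
)
```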
   
I'm reproducing this by running a simple dummy DAG (see below) the same way the scheduler invokes it:
```shell
airflow tasks run dummydag dummytasks 2021-02-09 --subdir ./dummy.py --local --pool default_pool
```
   
   ```python
   # dummy.py
   from datetime import datetime
   from airflow import DAG
   from airflow.operators.dummy_operator import DummyOperator
   
   args = {
       'owner': 'airflow',
       'start_date': datetime(2017, 9, 10, 0, 0),
       'random_logic': False
   }
   
   dag = DAG(
       'dummydag',
       schedule_interval="@once",
       default_args=args
   )
   
   t3 = DummyOperator(
       task_id='dummytasks',
       dag=dag
   )
   ```
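
For a quicker check that skips the DAG run entirely, a sketch like this should drive the same hook code path (if I'm reading `S3TaskHandler` right, remote logging goes through `load_string`, which ends up in the `upload_fileobj` call linked above); the connection ID, bucket, and key here are placeholders:

```python
# Hypothetical direct check of the hook's upload path; the conn ID,
# bucket, and key are placeholders, not values from this issue.
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

hook = S3Hook(aws_conn_id="aws_default")
hook.load_string(
    "test log line\n",
    key="airflow-logs/dummydag/dummytasks/1.log",  # placeholder key
    bucket_name="my-log-bucket",                   # placeholder bucket
    replace=True,
)
```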

