potiuk commented on issue #12969:
URL: https://github.com/apache/airflow/issues/12969#issuecomment-744336412


   I experimented a bit and I think we have a 3rd problem. I manually registered `logging.shutdown` after `dispose_orm` (atexit handlers are executed in reverse order of registration). We have protection against calling `close()` more than once, so this might even be a viable long-term solution:
   
   ```
       # Ensure we close DB connections at scheduler and gunicorn worker termination
       atexit.register(dispose_orm)
       atexit.register(logging.shutdown)
   ```
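
   A standalone sketch of why this registration order works (the function bodies here are placeholders, not Airflow's real implementations): `atexit` runs handlers in LIFO order, so registering `logging.shutdown` last makes it run first, while `dispose_orm` runs afterwards.

   ```python
   import atexit

   def dispose_orm():
       # Placeholder for Airflow's dispose_orm; just reports when it runs.
       print("dispose_orm: closing DB connections")

   def shutdown_logging():
       # Placeholder for logging.shutdown; flushes and closes log handlers.
       print("logging.shutdown: flushing handlers")

   # atexit executes handlers in reverse registration (LIFO) order,
   # so shutdown_logging runs first, then dispose_orm.
   atexit.register(dispose_orm)
   atexit.register(shutdown_logging)

   # On interpreter exit this prints:
   #   logging.shutdown: flushing handlers
   #   dispose_orm: closing DB connections
   ```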
   However, we hit another problem with the S3 handler:
   
   ```
    [2020-12-14 10:11:01,886] {s3_task_handler.py:208} ERROR - Could not write logs to s3://test-amazon-logging/airflowlogs/example_bash_operator/run_after_loop/2020-12-14T10:10:56.444247+00:00/1.log
    Traceback (most recent call last):
      File "/usr/local/lib/python3.8/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 201, in s3_write
        self.hook.load_string(
      File "/usr/local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 61, in wrapper
        return func(*bound_args.args, **bound_args.kwargs)
      File "/usr/local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 90, in wrapper
        return func(*bound_args.args, **bound_args.kwargs)
      File "/usr/local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 547, in load_string
        self._upload_file_obj(file_obj, key, bucket_name, replace, encrypt, acl_policy)
      File "/usr/local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 638, in _upload_file_obj
        client.upload_fileobj(file_obj, bucket_name, key, ExtraArgs=extra_args)
      File "/usr/local/lib/python3.8/site-packages/boto3/s3/inject.py", line 536, in upload_fileobj
        future = manager.upload(
      File "/usr/local/lib/python3.8/site-packages/s3transfer/manager.py", line 312, in upload
        return self._submit_transfer(
      File "/usr/local/lib/python3.8/site-packages/s3transfer/manager.py", line 468, in _submit_transfer
        self._submission_executor.submit(
      File "/usr/local/lib/python3.8/site-packages/s3transfer/futures.py", line 467, in submit
        future = ExecutorFuture(self._executor.submit(task))
      File "/usr/local/lib/python3.8/concurrent/futures/thread.py", line 181, in submit
        raise RuntimeError('cannot schedule new futures after '
    RuntimeError: cannot schedule new futures after interpreter shutdown
   ```
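
   The traceback shows s3transfer handing the upload to a `ThreadPoolExecutor`; once the interpreter has begun shutting down, `concurrent.futures` refuses any new work. Triggering the exact interpreter-shutdown variant inline is awkward, but the analogous executor-shutdown failure is easy to reproduce with just the stdlib:

   ```python
   from concurrent.futures import ThreadPoolExecutor

   # Once an executor stops accepting work (or, at interpreter shutdown,
   # the whole concurrent.futures machinery does), submit() raises
   # RuntimeError instead of queueing the task - which is what the S3
   # handler runs into when it tries to upload logs this late.
   executor = ThreadPoolExecutor(max_workers=1)
   executor.shutdown(wait=True)
   try:
       executor.submit(print, "too late")
   except RuntimeError as exc:
       print(f"RuntimeError: {exc}")
   ```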
   
   
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]