dejii commented on PR #49768:
URL: https://github.com/apache/airflow/pull/49768#issuecomment-3166390240

   I think it's fine to just use the `S3Hook` in subsequent tasks instead. 
Something like: 
   ```python
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    hook = S3Hook(aws_conn_id='aws_default')
   s3_objects = hook.list_keys(
       bucket_name='s3-bucket-name',
       prefix='some-prefix/',
   )
   ```
   
   I'm suggesting this because the behaviour won't be consistent if the 
triggerer encounters an interruption that causes a restart, i.e., the files 
will be copied but the return statement ends up being `None`.
   
https://github.com/apache/airflow/blob/30c74c742d089e52bf28ada0d92cd4f93533c030/providers/google/src/airflow/providers/google/cloud/transfers/s3_to_gcs.py#L357
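   To illustrate the failure mode without needing a running Airflow deployment, here is a minimal plain-Python sketch. The `StubS3Hook` class, bucket name, and keys are all hypothetical stand-ins; the stub only mimics the shape of `S3Hook.list_keys` to show why a downstream task should re-list keys itself rather than trust the operator's XCom return value after a triggerer restart:

```python
# Hypothetical stand-in for S3Hook; not the real Airflow class.
class StubS3Hook:
    def __init__(self, objects):
        self._objects = list(objects)

    def list_keys(self, bucket_name, prefix=""):
        # Like S3Hook.list_keys: return the object keys matching the prefix.
        return [key for key in self._objects if key.startswith(prefix)]


# Simulate the scenario from the comment: the files were copied to the
# bucket, but the triggerer restarted mid-deferral, so the operator's
# return value (the XCom) came back as None.
bucket = StubS3Hook(["some-prefix/a.csv", "some-prefix/b.csv", "other/c.csv"])
xcom_value = None  # what the downstream task would see after the restart

# Downstream task: ignore the unreliable XCom and ask the hook directly.
recovered_keys = xcom_value or bucket.list_keys(
    bucket_name="s3-bucket-name",  # hypothetical bucket
    prefix="some-prefix/",
)
print(recovered_keys)  # ['some-prefix/a.csv', 'some-prefix/b.csv']
```

   Because `list_keys` queries the bucket's actual state, the downstream task gets the same answer whether or not the triggerer was restarted, which is the consistency the XCom return value cannot guarantee here.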
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
