mik-laj commented on a change in pull request #21003:
URL: https://github.com/apache/airflow/pull/21003#discussion_r789672917



##########
File path: airflow/providers/google/cloud/transfers/s3_to_gcs.py
##########
@@ -21,9 +21,13 @@
 
 from airflow.exceptions import AirflowException
 from airflow.providers.amazon.aws.hooks.s3 import S3Hook
-from airflow.providers.amazon.aws.operators.s3_list import S3ListOperator
 from airflow.providers.google.cloud.hooks.gcs import GCSHook, _parse_gcs_url, gcs_object_is_directory
 
+try:
+    from airflow.providers.amazon.aws.operators.s3 import S3ListOperator
+except ImportError:
+    from airflow.providers.amazon.aws.operators.s3_list import S3ListOperator

Review comment:
       This will be problematic, as we do not yet have the technical ability to
enforce a specific provider version; we would have to add new code to gain that
ability. I think the current solution is sufficient and even more end-user
friendly. It is also in line with how we approach backward compatibility in
Airflow core.
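The fallback pattern in the diff above can be sketched generically: try the new import path first, and fall back to the legacy path on `ImportError`, so the same code works across provider versions without pinning a minimum version. The stdlib module names below are illustrative stand-ins, not the Airflow provider modules themselves:

```python
# Illustrative sketch of the try/except import-fallback idiom from the diff.
# shlex.quote (Python 3.3+) stands in for the new consolidated module path;
# pipes.quote (removed in Python 3.13) stands in for the deprecated one.
try:
    # Preferred: the new location of the symbol.
    from shlex import quote
except ImportError:
    # Fallback: the legacy location, for older environments.
    from pipes import quote

# Either way, callers use the imported name identically.
print(quote("hello world"))
```

Because the shim resolves at import time, downstream code stays oblivious to which provider version supplied the symbol, which is what makes this more end-user friendly than enforcing a version constraint.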




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
