matthewblock opened a new issue, #27587:
URL: https://github.com/apache/airflow/issues/27587

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==6.0.0
   
   ### Apache Airflow version
   
   2.4.2
   
   ### Operating System
   
   debian buster
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   [PR #25980](https://github.com/apache/airflow/pull/25980) removed the `s3` 
`conn_type`. [The release notes for 
6.0.0](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/index.html#breaking-changes)
 indicate that the user should use `aws` (Amazon Web Services) as the 
`conn_type` instead:
   
   > In order to restore ability to test connection you need to change 
connection type from Amazon S3 (conn_type="s3") to Amazon Web Services 
(conn_type="aws") manually.
   
   `S3Hook`, which assumed the `s3` `conn_type`, makes extensive use of the decorator 
[`provide_bucket_name()`](https://github.com/apache/airflow/blob/21063267fd9764b2ca38669e8faec75d9b87179c/airflow/providers/amazon/aws/hooks/s3.py#L61):
   
   > Function decorator that provides a bucket name taken from the connection 
in case no bucket name has been passed to the function.
   
   The code for this decorator looks for the `bucket_name` in the connection 
object's `schema` field, and the connection form for the `s3` type in the 
Airflow UI exposed a text input for that `schema` field.
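   The pattern described above can be sketched roughly as follows. This is a 
minimal illustration, not the actual provider source; `FakeConnection` and 
`FakeS3Hook` are hypothetical stand-ins for `airflow.models.Connection` and 
`S3Hook`:

   ```python
   import functools


   class FakeConnection:
       """Stand-in for airflow.models.Connection (hypothetical)."""

       def __init__(self, schema=None):
           self.schema = schema


   def provide_bucket_name(func):
       """Fill in bucket_name from the connection's schema field if omitted."""

       @functools.wraps(func)
       def wrapper(self, *args, bucket_name=None, **kwargs):
           if bucket_name is None:
               # Fall back to the connection's schema field
               bucket_name = self.get_connection().schema
           return func(self, *args, bucket_name=bucket_name, **kwargs)

       return wrapper


   class FakeS3Hook:
       def __init__(self, conn):
           self._conn = conn

       def get_connection(self):
           return self._conn

       @provide_bucket_name
       def list_keys(self, bucket_name=None):
           return f"listing keys in {bucket_name}"


   hook = FakeS3Hook(FakeConnection(schema="my-bucket"))
   print(hook.list_keys())  # falls back to the connection's schema field
   ```

   With an `aws`-type connection, the equivalent of `schema` is never set, so 
the fallback silently yields `None`.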
   
   The connection form for the replacement `aws` `conn_type`, however, has text 
inputs only for `AWS Access Key ID`, `AWS Secret Access Key`, and `Extra`.
   
   If the user follows the provider docs' recommendation and uses `aws` for 
`conn_type`, the decorator is rendered useless.
   
   ### What you think should happen instead
   
   Since the `aws` connection form has no text input for `schema` (i.e., 
`bucket_name`) the way the deprecated `s3` `conn_type` did, either:
   
   1. The `provide_bucket_name()` decorator should instead get the 
`bucket_name` from `extras`
   2. The decorator should be removed, and the user should be forced to 
explicitly enter the `bucket_name` as an arg for each of `S3Hook`'s methods 
that require it.
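   Option 1 might look roughly like the sketch below, with the fallback reading 
a `bucket_name` key from the connection's Extra JSON instead of `schema`. 
Again, `FakeConnection` and `FakeS3Hook` are hypothetical stand-ins, not the 
actual provider code:

   ```python
   import functools
   import json


   class FakeConnection:
       """Stand-in for airflow.models.Connection (hypothetical)."""

       def __init__(self, extra="{}"):
           self.extra = extra

       @property
       def extra_dejson(self):
           # Airflow exposes the JSON-decoded Extra field as `extra_dejson`
           return json.loads(self.extra)


   def provide_bucket_name(func):
       """Fall back to a `bucket_name` key in the connection's Extra JSON."""

       @functools.wraps(func)
       def wrapper(self, *args, bucket_name=None, **kwargs):
           if bucket_name is None:
               bucket_name = self.get_connection().extra_dejson.get("bucket_name")
           return func(self, *args, bucket_name=bucket_name, **kwargs)

       return wrapper


   class FakeS3Hook:
       def __init__(self, conn):
           self._conn = conn

       def get_connection(self):
           return self._conn

       @provide_bucket_name
       def list_keys(self, bucket_name=None):
           return bucket_name


   hook = FakeS3Hook(FakeConnection(extra='{"bucket_name": "my-bucket"}'))
   print(hook.list_keys())
   ```

   Since the `aws` connection form already has an `Extra` text input, this 
would keep the convenience of the decorator without reviving the removed `s3` 
form.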
   
   ### How to reproduce
   
   Try to create an Airflow connection of type `aws` (Amazon Web Services in 
the UI) in the Airflow connections UI. Note that there is no `schema` field.
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

