shlomi-viz opened a new issue #18111:
URL: https://github.com/apache/airflow/issues/18111


   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon==1.4.0
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-http==1.1.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-postgres==1.0.1
   apache-airflow-providers-sftp==1.1.1
   apache-airflow-providers-sqlite==2.0.1
   apache-airflow-providers-ssh==1.3.0
   ```
   
   ### Apache Airflow version
   
   2.0.2
   
   ### Operating System
   
   Mac and Linux
   
   ### Deployment
   
   MWAA
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   When trying to upload a file to a bucket in another AWS account using the `load_file` function, I got the following error:
   ```
   An error occurred (403) when calling the HeadObject operation: Forbidden
   ```
   Looking at the source code, the error comes from:
   ```python
       @provide_bucket_name
       @unify_bucket_name_and_key
    def check_for_key(self, key: str, bucket_name: Optional[str] = None) -> bool:
           """
           Checks if a key exists in a bucket
   
           :param key: S3 key that will point to the file
           :type key: str
           :param bucket_name: Name of the bucket in which the file is stored
           :type bucket_name: str
           :return: True if the key exists and False if not.
           :rtype: bool
           """
           try:
               self.get_conn().head_object(Bucket=bucket_name, Key=key)
               return True
           except ClientError as e:
               if e.response["ResponseMetadata"]["HTTPStatusCode"] == 404:
                   return False
               else:
                   raise e
   ```
   
   As I understand it, I'm getting this error because I don't have `List` permissions on the bucket in the other account.
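
   For illustration, this is the S3 behaviour I believe is at play (the bucket and key below are just placeholders): without `s3:ListBucket` on the target bucket, a `HeadObject` call for a missing key returns 403 rather than 404, so `check_for_key` re-raises instead of returning `False`.

   ```python
   import boto3
   from botocore.exceptions import ClientError

   s3 = boto3.client("s3")
   try:
       # "cross-account-bucket" and the key are placeholders
       s3.head_object(Bucket="cross-account-bucket", Key="some/missing/key")
   except ClientError as e:
       # Without s3:ListBucket on the bucket, S3 answers 403 (Forbidden)
       # instead of 404 (Not Found) for a key that does not exist.
       print(e.response["ResponseMetadata"]["HTTPStatusCode"])
   ```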
   
   ### What you expected to happen
   
   Airflow should try to upload the file even if it already exists. In this section of the code, we could add another check for the response code we get when we have no `List` permissions:
   
   ```python
           except ClientError as e:
               if e.response["ResponseMetadata"]["HTTPStatusCode"] == 404:
                   return False
               else:
                   raise e
   ```
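
   A minimal sketch of how `check_for_key` could handle that case (treating 403 as "cannot verify the key, assume it is absent" is my assumption here, not a settled design):

   ```python
    @provide_bucket_name
    @unify_bucket_name_and_key
    def check_for_key(self, key: str, bucket_name: Optional[str] = None) -> bool:
        """Check if a key exists in a bucket, tolerating missing List permissions."""
        try:
            self.get_conn().head_object(Bucket=bucket_name, Key=key)
            return True
        except ClientError as e:
            status_code = e.response["ResponseMetadata"]["HTTPStatusCode"]
            if status_code == 404:
                return False
            # Assumption: without s3:ListBucket, HeadObject answers 403 for a
            # missing key, so treat it the same as "key does not exist".
            if status_code == 403:
                return False
            raise
   ```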
   
   ### How to reproduce
   
   Use the following call to upload a file to an S3 bucket on which you only have `Get` and `Put` permissions:
   ```python
   s3_hook.load_file(local_file_path, bucket_name=bucket_name, key=s3_key, acl_policy='bucket-owner-full-control')
   ```
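
   For completeness, a self-contained version of that reproduction could look like this (the connection id, paths, bucket, and key are placeholders for a bucket in another account where the role only has `Get`/`Put`):

   ```python
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook

   # Placeholders: point these at a cross-account bucket where the role has
   # s3:GetObject and s3:PutObject but no s3:ListBucket.
   s3_hook = S3Hook(aws_conn_id="aws_default")
   s3_hook.load_file(
       "/tmp/local_file.csv",
       key="path/in/bucket/file.csv",
       bucket_name="cross-account-bucket",
       acl_policy="bucket-owner-full-control",
   )
   ```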
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

