kuga14 opened a new issue #14610:
URL: https://github.com/apache/airflow/issues/14610


   **Version: v2.0.1**
   **Git Version: .release:2.0.1+beb8af5ac6c438c29e2c186145115fb1334a3735**
   **Docker Image Tag: 2.0.1-python3.8-build**
   
   **What happened:**
   I am trying to use GCSToS3Operator to sync data between GCS and Yandex Object Storage.
   
   I get the following error:
   
   ```
   [2021-03-04 19:12:37,937] {taskinstance.py:1455} ERROR - An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
   Traceback (most recent call last):
     File "/root/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
       self._prepare_and_execute_task_with_callbacks(context, task)
     File "/root/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
       result = self._execute_task(context, task_copy)
     File "/root/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
       result = task_copy.execute(context=context)
     File "/root/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/transfers/gcs_to_s3.py", line 183, in execute
       s3_hook.load_bytes(
     File "/root/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 61, in wrapper
       return func(*bound_args.args, **bound_args.kwargs)
     File "/root/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 90, in wrapper
       return func(*bound_args.args, **bound_args.kwargs)
     File "/root/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 600, in load_bytes
       self._upload_file_obj(file_obj, key, bucket_name, replace, encrypt, acl_policy)
     File "/root/.local/lib/python3.8/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 654, in _upload_file_obj
       client.upload_fileobj(file_obj, bucket_name, key, ExtraArgs=extra_args)
     File "/root/.local/lib/python3.8/site-packages/boto3/s3/inject.py", line 539, in upload_fileobj
       return future.result()
     File "/root/.local/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
       return self._coordinator.result()
     File "/root/.local/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
       raise self._exception
     File "/root/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
       return self._execute_main(kwargs)
     File "/root/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
       return_value = self._main(**kwargs)
     File "/root/.local/lib/python3.8/site-packages/s3transfer/upload.py", line 692, in _main
       client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
     File "/root/.local/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
       return self._make_api_call(operation_name, kwargs)
     File "/root/.local/lib/python3.8/site-packages/botocore/client.py", line 676, in _make_api_call
       raise error_class(parsed_response, operation_name)
   botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
   ```
   
   GCSToS3Operator calls the S3Hook.load_bytes function without passing bucket_name:
   ```
   s3_hook.load_bytes(
       cast(bytes, file_bytes), key=dest_key, replace=self.replace, acl_policy=self.s3_acl_policy
   )
   ```
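   
   For context on the two `wrapper` frames at `s3.py` line 61 and line 90 in the traceback: as far as I understand (this is my reading of the hook, so treat it as an assumption), when `bucket_name` is not passed, `S3Hook` either takes the bucket from the connection or falls back to splitting it out of the key itself via `S3Hook.parse_s3_url`, roughly like this (the URL below is a placeholder):
   
   ```
   # Hedged sketch of the fallback when bucket_name is omitted (my assumption,
   # not the operator's actual code); the URL is a placeholder.
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook
   
   bucket, key = S3Hook.parse_s3_url("s3://my-dest-bucket/path/data.csv")
   print(bucket, key)  # -> my-dest-bucket path/data.csv
   ```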
   
   When I use S3Hook in a PythonOperator to do the same thing, I additionally specify the bucket_name parameter:
   ```
   s3_hook.load_bytes(
       cast(bytes, file_bytes), key=dest_key, bucket_name=bucket_name, replace=replace
   )
   ```
   and it works.
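   
   For completeness, a minimal sketch of that workaround (connection ids, bucket names and the prefix are placeholders, and the `yandex_s3` connection is assumed to be configured with the Yandex Object Storage endpoint):
   
   ```
   # Workaround sketch: do the transfer in a PythonOperator and pass bucket_name
   # explicitly to S3Hook.load_bytes. All names/ids below are placeholders.
   from airflow.operators.python import PythonOperator
   from airflow.providers.amazon.aws.hooks.s3 import S3Hook
   from airflow.providers.google.cloud.hooks.gcs import GCSHook
   
   
   def gcs_to_yandex_s3(**context):
       gcs_hook = GCSHook(gcp_conn_id="google_cloud_default")
       s3_hook = S3Hook(aws_conn_id="yandex_s3")
   
       for obj in gcs_hook.list(bucket_name="my-gcs-bucket", prefix="data/"):
           file_bytes = gcs_hook.download(bucket_name="my-gcs-bucket", object_name=obj)
           s3_hook.load_bytes(
               file_bytes,
               key=obj,
               bucket_name="my-dest-bucket",  # explicit bucket_name is what makes this work
               replace=True,
           )
   
   
   # inside a DAG definition:
   sync_task = PythonOperator(task_id="gcs_to_yandex_s3", python_callable=gcs_to_yandex_s3)
   ```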
   
   
   
   

