ivanrezic opened a new issue, #31419:
URL: https://github.com/apache/airflow/issues/31419

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   I am using MWAA, Airflow version **2.5.1**.
   
   I wanted to use `RedshiftToS3Operator`, but I am getting this error message:
   
   ```
   [2023-05-19, 14:06:08 UTC] {{connection_wrapper.py:334}} INFO - AWS Connection (conn_id='business_integration_gainsight', conn_type='aws') credentials retrieved from login and password.
   [2023-05-19, 14:06:08 UTC] {{redshift_to_s3.py:158}} INFO - Executing UNLOAD command...
   [2023-05-19, 14:06:08 UTC] {{base.py:73}} INFO - Using connection ID 'redshift_postgres' for task execution.
   [2023-05-19, 14:06:08 UTC] {{taskinstance.py:1768}} ERROR - Task failed with exception
   Traceback (most recent call last):
     File "/usr/local/airflow/.local/lib/python3.10/site-packages/airflow/providers/amazon/aws/transfers/redshift_to_s3.py", line 159, in execute
       redshift_hook.run(unload_query, self.autocommit, parameters=self.parameters)
     File "/usr/local/airflow/.local/lib/python3.10/site-packages/airflow/providers/common/sql/hooks/sql.py", line 342, in run
       with closing(self.get_conn()) as conn:
     File "/usr/local/airflow/.local/lib/python3.10/site-packages/airflow/providers/amazon/aws/hooks/redshift_sql.py", line 129, in get_conn
       return redshift_connector.connect(**conn_kwargs)
   TypeError: connect() got an unexpected keyword argument 'aws_conn_id'
   ```
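
   From the traceback, `RedshiftSQLHook.get_conn` appears to pass a prepared `conn_kwargs` dict straight into `redshift_connector.connect()`. Below is a minimal sketch of that mechanism (paraphrased, not the provider's exact code; `get_conn_sketch` and its variable names are illustrative), assuming the hook merges the Airflow connection's `extra` JSON into the connect kwargs, in which case a stray `aws_conn_id` key in the extras would produce exactly this `TypeError`:

   ```
   # Sketch of how the hook's kwargs could be assembled (an assumption, not
   # the provider's exact code). If the Airflow connection's extra JSON is
   # merged into the kwargs, any stray key in it (e.g. 'aws_conn_id') is
   # forwarded verbatim to redshift_connector.connect().
   import redshift_connector

   def get_conn_sketch(conn):
       conn_params = {
           "user": conn.login,
           "password": conn.password,
           "host": conn.host,
           "port": conn.port,
           "database": conn.schema,
       }
       # extra_dejson is the parsed 'extra' JSON of the Airflow connection;
       # a key like {"aws_conn_id": "..."} here would trigger the TypeError
       # shown in the log above.
       conn_kwargs = {**conn_params, **conn.extra_dejson}
       return redshift_connector.connect(**conn_kwargs)
   ```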
   
   This is my operator:
   
   ```
   move_data_from_view_to_s3 = RedshiftToS3Operator(
       task_id=f"{task_group_name}_view_to_s3",
       schema=params.redshift_schema,
       table=created_view_name,
       s3_bucket=params.destination_bucket,
       s3_key=params.destination_key,
       redshift_conn_id=params.redshift_conn_id,
       aws_conn_id=params.gainsight_conn_id,
       include_header=True,
       unload_options=["ALLOWOVERWRITE", "PARALLEL OFF", "DELIMITER ';'"],
       dag=dag,
   )
   ```
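
   To check whether the Redshift connection's extras are the source of the stray keyword (an assumption on my part, not a confirmed diagnosis), they can be dumped like this:

   ```
   # Diagnostic sketch: print the extras of the Redshift connection.
   # 'redshift_postgres' matches the connection ID from the log above.
   from airflow.hooks.base import BaseHook

   conn = BaseHook.get_connection("redshift_postgres")
   # If 'aws_conn_id' appears in this dict, it would be forwarded to
   # redshift_connector.connect() and raise the TypeError.
   print(conn.extra_dejson)
   ```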
   
   ### What you think should happen instead
   
   There should be no error messages, and the command should just execute successfully.
   
   ### How to reproduce
   
   Run:
   
   ```
   move_data_from_view_to_s3 = RedshiftToS3Operator(
       task_id=f"{task_group_name}_view_to_s3",
       schema=params.redshift_schema,
       table=created_view_name,
       s3_bucket=params.destination_bucket,
       s3_key=params.destination_key,
       redshift_conn_id=params.redshift_conn_id,
       aws_conn_id=params.gainsight_conn_id,
       include_header=True,
       unload_options=["ALLOWOVERWRITE", "PARALLEL OFF", "DELIMITER ';'"],
       dag=dag,
   )
   ```
   
   with `apache-airflow-providers-amazon==7.1.0` and Airflow 2.5.1.
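
   For completeness, here is a minimal self-contained DAG that wraps the operator above (a sketch: the two connection IDs come from the log, everything else is a placeholder):

   ```
   # Minimal repro DAG sketch. The connection IDs match the ones in the log;
   # schema, table, bucket, and key are placeholders for your own values.
   from datetime import datetime

   from airflow import DAG
   from airflow.providers.amazon.aws.transfers.redshift_to_s3 import RedshiftToS3Operator

   with DAG(
       dag_id="redshift_to_s3_repro",
       start_date=datetime(2023, 5, 1),
       schedule=None,
       catchup=False,
   ) as dag:
       RedshiftToS3Operator(
           task_id="view_to_s3",
           schema="my_schema",
           table="my_view",
           s3_bucket="my-bucket",
           s3_key="my/key",
           redshift_conn_id="redshift_postgres",
           aws_conn_id="business_integration_gainsight",
           include_header=True,
           unload_options=["ALLOWOVERWRITE", "PARALLEL OFF", "DELIMITER ';'"],
       )
   ```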
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==7.1.0
   
   ### Deployment
   
   Amazon (AWS) MWAA
   
   ### Deployment details
   
   K8s (EKS) Server Version: v1.24.10-eks-48e63af
   
   ### Anything else
   
   Every time I run `RedshiftToS3Operator`
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

