Kengo Seki created AIRFLOW-2247:
-----------------------------------
Summary: Fix RedshiftToS3Transfer not to fail with ValueError
Key: AIRFLOW-2247
URL: https://issues.apache.org/jira/browse/AIRFLOW-2247
Project: Apache Airflow
Issue Type: Bug
Components: aws, redshift
Reporter: Kengo Seki
I tried to use RedshiftToS3Transfer but it failed with:
{code}
/path/to/incubator-airflow/airflow/operators/redshift_to_s3_operator.py in execute(self, context)
     69         self.hook = PostgresHook(postgres_conn_id=self.redshift_conn_id)
     70         self.s3 = S3Hook(aws_conn_id=self.aws_conn_id)
---> 71         a_key, s_key = self.s3.get_credentials()
     72         unload_options = '\n\t\t\t'.join(self.unload_options)
     73

ValueError: too many values to unpack
{code}
This is caused by a mismatch between the number of variables and the returned values.
As AwsHook.get_credentials' docstring says, the return value carries three values, not two:
{code}
def get_credentials(self, region_name=None):
    """Get the underlying `botocore.Credentials` object.

    This contains the attributes: access_key, secret_key and token.
    """
{code}
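The failure mode can be reproduced in isolation. The sketch below uses a hypothetical namedtuple standing in for the credentials object (the real one comes from botocore); it shows why two-variable unpacking raises ValueError and how reading the attributes from the returned object instead avoids it:

```python
from collections import namedtuple

# Hypothetical stand-in for the object returned by AwsHook.get_credentials():
# per the docstring, it carries access_key, secret_key and token.
Credentials = namedtuple("Credentials", ["access_key", "secret_key", "token"])


def get_credentials():
    # Dummy values for illustration only.
    return Credentials("ACCESS", "SECRET", "TOKEN")


# Unpacking three values into two variables fails:
try:
    a_key, s_key = get_credentials()
except ValueError as e:
    print(e)  # too many values to unpack

# Fix sketch: keep the returned object and read the attributes needed.
creds = get_credentials()
a_key, s_key = creds.access_key, creds.secret_key
```

With this pattern the operator no longer depends on the arity of the return value, so adding further attributes to the credentials object stays backward compatible.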
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)