jon-evergreen opened a new issue #20672:
URL: https://github.com/apache/airflow/issues/20672


   ### Apache Airflow Provider(s)
   
   amazon, postgres
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==2.4.0
   apache-airflow-providers-postgres==2.4.0
   
   ### Apache Airflow version
   
   2.2.3 (latest released)
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
   Docker image based on the official images, with the addition of tooling to 
pull in DAGs from S3 (objinsync). The provider/Airflow portions of the Docker 
build are unchanged from the official image.
   
   ### What happened
   
   When attempting to get a connection to Redshift via IAM/GetClusterCredentials, 
the following error occurs:
   ```
   Traceback (most recent call last):
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/task/task_runner/standard_task_runner.py",
 line 85, in _start_by_fork
       args.func(args, dag=self.dag)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/cli_parser.py", 
line 48, in command
       return func(*args, **kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/cli.py", line 
92, in wrapper
       return f(*args, **kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py",
 line 298, in task_run
       _run_task_by_selected_method(args, dag, ti)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py",
 line 107, in _run_task_by_selected_method
       _run_raw_task(args, ti)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/cli/commands/task_command.py",
 line 180, in _run_raw_task
       ti._run_raw_task(
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/session.py", 
line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py",
 line 1329, in _run_raw_task
       self._execute_task_with_callbacks(context)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py",
 line 1455, in _execute_task_with_callbacks
       result = self._execute_task(context, self.task)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py",
 line 1511, in _execute_task
       result = execute_callable(context=context)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/postgres/operators/postgres.py",
 line 69, in execute
       self.hook.run(self.sql, self.autocommit, parameters=self.parameters)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/hooks/dbapi.py", line 
198, in run
       with closing(self.get_conn()) as conn:
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/postgres/hooks/postgres.py",
 line 92, in get_conn
       conn.login, conn.password, conn.port = self.get_iam_token(conn)
     File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/postgres/hooks/postgres.py",
 line 195, in get_iam_token
       session, endpoint_url = aws_hook._get_credentials()
   TypeError: _get_credentials() missing 1 required positional argument: 
'region_name'
   ```
   
   ### What you expected to happen
   
   It should just connect to Redshift, as it did previously.
   
   It seems the relevant code for the operator has changed on the 
`v2-2-stable` branch, but that change has not yet been included in a release of 
the provider package. I can't verify at this time whether the changed provider 
code works, but it has at least changed!
   
   Also, it seems that the AWS base hook's `_get_credentials()` method 
([here](https://github.com/apache/airflow/blob/06c82e17e9d7ff1bf261357e84c6013ccdb3c241/airflow/providers/amazon/aws/hooks/base_aws.py#L395))
 has its `region_name` parameter annotated as `Optional[str]` but with no 
default value. It should be `region_name: Optional[str] = None` (i.e. with a 
default specified too), since type annotations are purely advisory and do not 
make an argument optional at call time.
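   To illustrate the annotation point above, here is a minimal standalone sketch 
(not Airflow code; the function names are made up) showing that `Optional[str]` 
alone does not make an argument optional — only a default value does, which is 
why the bare `_get_credentials()` call raises the same class of `TypeError` as 
in the traceback:

   ```python
   from typing import Optional

   # Annotated Optional, but no default: the argument is still required.
   def without_default(region_name: Optional[str]):
       return region_name

   # Annotated Optional *and* given a default: the argument may be omitted.
   def with_default(region_name: Optional[str] = None):
       return region_name

   try:
       without_default()  # TypeError, just like the traceback above
   except TypeError as exc:
       print(exc)  # → without_default() missing 1 required positional argument: 'region_name'

   print(with_default())  # → None
   ```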
   
   ### How to reproduce
   
   Try to use a Redshift connection via the Postgres operator with IAM 
authentication.
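   For reference, a connection of roughly this shape triggers the failing code 
path (the connection id, user, host, and database below are all illustrative; 
the `iam=true` extra is what routes `PostgresHook.get_conn()` through 
`get_iam_token()`, and `redshift=true` selects the GetClusterCredentials path):

   ```shell
   # Hypothetical Redshift connection via environment variable; all names are
   # illustrative. iam=true + redshift=true in the URI query string become
   # connection extras that enable the IAM/GetClusterCredentials code path.
   export AIRFLOW_CONN_MY_REDSHIFT='postgres://awsuser@my-cluster.abc123.us-east-1.redshift.amazonaws.com:5439/dev?iam=true&redshift=true'
   ```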
   
   ### Anything else
   
   It fails every time it's used.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

