vschettino opened a new issue #18276: URL: https://github.com/apache/airflow/issues/18276
### Apache Airflow version

2.0.2

### Operating System

CentOS 8

### Versions of Apache Airflow Providers

apache-airflow-providers-amazon==2.2.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1

### Deployment

MWAA

### Deployment details

The problem happens both in a local Docker container and on MWAA.

### What happened

I am trying to set a user for the `aws_default` connection so that I can run Fargate tasks and access AWS Parameter Store during DAG execution. According to the [docs](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html):

> The default connection ID is aws_default. If the environment/machine where you are running Airflow has the file credentials in /home/.aws/, and the default connection has user and pass fields empty, it will take automatically the credentials from there.

However, no matter how I set those values, the default connection is always empty:

```
[2021-09-15 16:04:06,915] {{standard_task_runner.py:77}} INFO - Job 16: Subtask data-mwaa-lab
[2021-09-15 16:04:07,060] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,060] {{base_aws.py:368}} INFO - Airflow Connection: aws_conn_id=aws_default
[2021-09-15 16:04:07,111] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,111] {{base_aws.py:179}} INFO - No credentials retrieved from Connection
[2021-09-15 16:04:07,139] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,139] {{base_aws.py:87}} INFO - Creating session with aws_access_key_id=None region_name=us-east-2
[2021-09-15 16:04:07,179] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,179] {{base_aws.py:157}} INFO - role_arn is None
```

### What you expected to happen

My understanding is that when Login and Password are set in the UI (`connection/edit/2`), those values should be used instead of falling back to the `/home/.aws/` file, which is empty.
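The log lines above suggest the hook falls through a resolution order: connection Login/Password first, then the local credentials file, then the default boto3 chain. A rough, self-contained sketch of that decision (hypothetical function and field names; the real logic lives in `base_aws.py` of the Amazon provider) is:

```python
def resolve_credentials(conn_login, conn_password, credentials_file):
    """Roughly mimic how the AWS hook picks credentials.

    Hypothetical sketch for illustration only; not the provider's
    actual implementation.
    """
    if conn_login and conn_password:
        # Expected behavior: UI-supplied Login/Password should win.
        return {"aws_access_key_id": conn_login,
                "aws_secret_access_key": conn_password,
                "source": "connection"}
    if credentials_file:
        # Fallback described in the docs: ~/.aws credentials file.
        return {**credentials_file, "source": "credentials_file"}
    # Matches the observed log: "Creating session with aws_access_key_id=None".
    return {"aws_access_key_id": None, "source": "default_chain"}
```

With Login/Password set, `resolve_credentials("AKIA_EXAMPLE", "secret", None)["source"]` should be `"connection"`; the logs show the hook behaving as if both arguments were empty.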
When I create an identical connection called `aws`, the Fargate task is launched correctly. The problem is that I need this to be the default connection so that Parameter Store values can be fetched to configure the DAG run.

### How to reproduce

I was able to reproduce the same issue in a Docker container on my machine and on an old Airflow 1.x instance. Simply setting valid AWS credentials via the UI in the Login/Password fields of the `aws_default` connection does not work.

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
