bmanan7 opened a new issue, #58313:
URL: https://github.com/apache/airflow/issues/58313

   ### Apache Airflow version
   
   3.1.2
   
   ### If "Other Airflow 2/3 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   When programmatically retrieving a connection from AWS Secrets Manager whose secret JSON does not include `conn_type`, Airflow 3 raises:
   ```
   AirflowNotFoundException: The conn_id `dev_sec` isn't defined
   ```
   However, running:
   ```
   airflow connections get dev_sec -o yaml
   ```
   returns the expected connection details, showing that the secret is found and accessible via the CLI; only the programmatic lookup fails.
   
   Also, in Airflow 2 the same secret is retrieved successfully via the programmatic lookup, but with `conn_type=None`.
   
   ### What you think should happen instead?
   
   Airflow 3 should maintain backward compatibility and allow retrieving connections without an explicit `conn_type`, defaulting it to `None` or similar (consistent with Airflow 2 behavior).
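
For illustration, the Airflow 2 behavior being asked for can be sketched in plain Python (no Airflow imports; the field mapping here is an assumption based on the secret JSON used in the reproduction steps):

```python
import json

# Secret payload as stored in AWS Secrets Manager -- note there is no conn_type key.
secret_payload = '{"username": "tuser", "password": "tpass"}'
fields = json.loads(secret_payload)

# Airflow 2 tolerated the missing conn_type and simply left it unset (None);
# the request is that Airflow 3 do the same instead of raising
# AirflowNotFoundException.
conn = {
    "conn_type": fields.get("conn_type"),  # None when the key is absent
    "login": fields.get("username"),
    "password": fields.get("password"),
}
print(conn["conn_type"])  # prints: None
```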
   
   ### How to reproduce
   
   1. Create an AWS secret named `airflow/connections/dev_sec` with the following JSON:
   ```
   {
     "username": "tuser",
     "password": "tpass"
   }
   ```
   (no `conn_type` field)
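
For step 1, the secret can be created with the AWS CLI (a sketch; assumes credentials are configured and uses the `us-east-2` region from the example below):

```
aws secretsmanager create-secret \
  --name airflow/connections/dev_sec \
  --region us-east-2 \
  --secret-string '{"username": "tuser", "password": "tpass"}'
```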
   
   2. Configure Airflow to use `SecretsManagerBackend` with valid AWS credentials, for example:
   ```
   AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
   AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "region_name": "us-east-2"}'
   AWS_DEFAULT_REGION=us-east-2
   AWS_ACCESS_KEY_ID=XXXXXX
   AWS_SECRET_ACCESS_KEY=XXXX
   ```
   
   3. In a DAG, try to access the connection:
   ```
   import logging
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   from airflow.utils.timezone import datetime
   from airflow.hooks.base import BaseHook
   from airflow.models.connection import Connection 
   
   CONN_ID = "dev_sec"
   
   def _print_conn():
       conn: Connection = BaseHook.get_connection(CONN_ID)
       logging.info(
           "Got conn: type=%s host=%s port=%s schema=%s extras=%s",
           conn.conn_type, conn.host, conn.port, conn.schema,
           list(conn.extra_dejson.keys()),
       )
       print(f"TYPE={conn.conn_type}")
       print(f"HOST={conn.host}")
       print(f"PORT={conn.port}")
       return "printed"
   
   with DAG(
       dag_id="print_conn_via_basehook",
       start_date=datetime(2024, 1, 1),
       schedule=None,
       catchup=False,
       tags=["secrets", "aws", "basehook"],
   ) as dag:
       PythonOperator(task_id="print_conn", python_callable=_print_conn)
   ```
   4. Observe the error on Airflow 3:
   ```
   AirflowNotFoundException: The conn_id `dev_sec` isn't defined
   ```
   5. Run the same DAG on Airflow 2: it succeeds, but with `conn_type=None`.
   
   ### Operating System
   
   macOS 15.6.1
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   When `conn_type: aws` is added to the secret JSON, Airflow 3 succeeds:
   ```
   {
     "conn_type": "aws",
     "username": "tuser",
     "password": "tpass"
   }
   ```
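
This workaround can be applied to the existing secret with the AWS CLI (a sketch; assumes the same region and credentials as in the reproduction steps):

```
aws secretsmanager put-secret-value \
  --secret-id airflow/connections/dev_sec \
  --region us-east-2 \
  --secret-string '{"conn_type": "aws", "username": "tuser", "password": "tpass"}'
```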
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

