jablecherman commented on issue #51316:
URL: https://github.com/apache/airflow/issues/51316#issuecomment-2932327875

   @karenbraganz 
   
   I saw those docs and tried that as well, but it did not fix the problem.
I get the same error with the snippet below. Let me know if you can reproduce
it or if you think I'm doing something wrong here. Reminder: I am using
`3.0.0+astro.2`.
   
   ```python
   # test_dag.py
   from airflow.models.dag import DAG
   from airflow.providers.standard.operators.python import PythonOperator
   from airflow.sdk.definitions.connection import Connection
   
   MSSQL_CONN_ID = "s3_internal"
   
   with DAG(
       dag_id="test_dag",
       catchup=False,
   ) as dag:
       PythonOperator(
            task_id="foo",
            python_callable=Connection.get,
            op_args=[MSSQL_CONN_ID],
       )
   
   if __name__ == "__main__":
       dag.test(conn_file_path="connections.yaml")
   ```
   
   ```yaml
   # connections.yaml
   s3_internal:
     conn_type: amazon
     conn_description:
     conn_login: *****
     conn_password: *****
     conn_id: s3_internal
   ```

