ashishraman opened a new issue #11649:
URL: https://github.com/apache/airflow/issues/11649


   
   **Apache Airflow version**: 1.10.12
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.17
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: AWS EKS
   
   **What happened**:
   When spawning a worker pod, the scheduler injects the SQLAlchemy connection string in plain text into the pod's environment variables:
   
   ```
   Environment:
         AIRFLOW__CORE__DAGS_FOLDER:       /usr/local/airflow/dags/
         AIRFLOW__CORE__EXECUTOR:          LocalExecutor
         AIRFLOW__CORE__SQL_ALCHEMY_CONN:  postgresql+psycopg2://<<Redacted>>:5432/airflow_db
   ```
   
   Anyone with read permission on pods can describe the pod and obtain the database credentials.
   
   **What you expected to happen**:
   The variable should be passed as a Kubernetes Secret, set either manually by the user or automatically by the scheduler.
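
   As a sketch of the expected behavior, the worker pod could source the variable from a Kubernetes Secret via `secretKeyRef`; the Secret name, key, and credentials below are illustrative, not something Airflow generates today:

   ```yaml
   # Illustrative Secret holding the connection string (values are placeholders)
   apiVersion: v1
   kind: Secret
   metadata:
     name: airflow-secrets
   type: Opaque
   stringData:
     sql_alchemy_conn: postgresql+psycopg2://user:password@postgres:5432/airflow_db
   ---
   # Worker pod spec: reference the Secret instead of embedding the literal value
   apiVersion: v1
   kind: Pod
   metadata:
     name: airflow-worker
   spec:
     containers:
       - name: base
         image: apache/airflow:1.10.12
         env:
           - name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
             valueFrom:
               secretKeyRef:
                 name: airflow-secrets
                 key: sql_alchemy_conn
   ```

   With `secretKeyRef`, `kubectl describe pod` shows only the Secret reference, not the resolved credential.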
   
   
   **How to reproduce it**:
   Use the KubernetesExecutor with a PythonOperator. The new pod spawned by the scheduler has the environment variable AIRFLOW__CORE__SQL_ALCHEMY_CONN in plain text. I tried AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD as well, but when the scheduler spawns the worker pod it still passes AIRFLOW__CORE__SQL_ALCHEMY_CONN down in plain text.
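
   For context on the `_CMD` attempt: the `_CMD` variant is meant to resolve the value by running a command at config-read time instead of storing the literal string in the environment. A minimal sketch of that mechanism, with an illustrative file path standing in for a mounted Secret:

   ```shell
   # Write the connection string to a file readable only by the airflow user
   # (in Kubernetes this would typically be a Secret mounted as a volume).
   mkdir -p /tmp/airflow-secrets
   echo "postgresql+psycopg2://user:password@postgres:5432/airflow_db" \
     > /tmp/airflow-secrets/sql_alchemy_conn
   chmod 600 /tmp/airflow-secrets/sql_alchemy_conn

   # Airflow runs this command to obtain the value, so the credential itself
   # never appears in the pod's environment:
   export AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD="cat /tmp/airflow-secrets/sql_alchemy_conn"

   # Simulate what Airflow does when resolving the _CMD option:
   eval "$AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD"
   ```

   The bug reported here is that even when the `_CMD` variant is configured, the scheduler still templates the plain `AIRFLOW__CORE__SQL_ALCHEMY_CONN` value into the worker pod spec.
   
   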
   
   
   

