GitHub user sreyan32 created a discussion: airflow does not obey AIRFLOW__CORE__HIDE_SENSITIVE_VAR_CONN_FIELDS=False and still masks passwords

### Apache Airflow version

Other Airflow 2/3 version (please specify below)

### If "Other Airflow 2/3 version" selected, which one?

2.10.2

### What happened?

I am setting `AIRFLOW__CORE__HIDE_SENSITIVE_VAR_CONN_FIELDS=False` via an `-e` 
environment variable on an Airflow Podman image.

However, this configuration is not respected: passwords are still masked in my 
Airflow task logs.

<img width="718" height="25" alt="Image" src="https://github.com/user-attachments/assets/2b6cee7d-5507-451b-990a-292795e71997" />

How can I fully disable masking for the task logs?

### What you think should happen instead?

The configuration `AIRFLOW__CORE__HIDE_SENSITIVE_VAR_CONN_FIELDS=False` should 
be respected and all masking should be disabled in the Airflow task logs.
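
For context on the observed behavior: Airflow applies masking as a logging filter that rewrites secrets to `***` before a record is emitted. The sketch below is a simplified stand-in for that mechanism, not Airflow's actual `SecretsMasker` code; the class name and secret value are made up:

```python
import logging
import re


class MaskingFilter(logging.Filter):
    """Simplified stand-in for Airflow's secrets-masking logging filter."""

    def __init__(self, secrets):
        super().__init__()
        # Build one regex that matches any registered secret value
        self.pattern = re.compile("|".join(re.escape(s) for s in secrets))

    def filter(self, record):
        # Replace every secret occurrence with *** before the record is emitted
        record.msg = self.pattern.sub("***", str(record.msg))
        return True


logger = logging.getLogger("demo")
handler = logging.StreamHandler()
handler.addFilter(MaskingFilter(["s3cret-passphrase"]))
logger.addHandler(handler)
logger.warning("connecting with password s3cret-passphrase")
# the handler emits: connecting with password ***
```

Because the filter sits on the logging path itself, anything a task prints that contains a registered secret gets rewritten, which matches the `***` seen in the task log screenshot.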

### How to reproduce

Start Airflow with:

```shell
podman run -d --name airflow --network airflow-net --cpus 8 --memory 8192m \
  -e AIRFLOW__CORE__EXECUTOR=LocalExecutor \
  -e AIRFLOW_UID=50000 \
  -e AIRFLOW__CORE__HIDE_SENSITIVE_VAR_CONN_FIELDS=False \
  -e AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow \
  -v doc-dbt:/opt/airflow/doc-dbt \
  -v airflow-dags:/opt/airflow/dags \
  -v airflow-logs:/opt/airflow/logs \
  -v airflow-plugins:/opt/airflow/plugins \
  -p 8080:8080 apache/airflow:2.10.5 webserver
```

Create a connection named `snowflake_test` with a password.
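
One way to set that connection up for reproduction is the Airflow CLI inside the container; all values here are illustrative stand-ins:

```shell
# Illustrative values only; run inside the Airflow container
airflow connections add snowflake_test \
    --conn-type snowflake \
    --conn-login my_user \
    --conn-password 'my-key-passphrase' \
    --conn-extra '{"private_key_content": "<base64-encoded PEM key>"}'
```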

Use the following DAG to test:


```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago
from airflow.models.connection import Connection
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import base64


def python_test():
    # Fetch the connection from Airflow secrets
    conn = Connection.get_connection_from_secrets("snowflake_test")

    # Extract the private key from the connection's extra fields
    extra_dict = conn.extra_dejson
    private_key_content = extra_dict.get("private_key_content")
    print(f"private_key_content: {private_key_content}")

    # Use the connection password as the key passphrase, if one is set
    passphrase = None
    if conn.password:
        print(f"conn.test:{conn.password}")
        passphrase = conn.password.strip().encode()

    # Decode the base64-encoded PEM key and load it
    private_key_pem = base64.b64decode(private_key_content)
    p_key = serialization.load_pem_private_key(
        private_key_pem,
        password=passphrase,
        backend=default_backend(),
    )


# Define the DAG
with DAG(
    dag_id="snowflake_private_key_dag",
    start_date=days_ago(1),
    schedule_interval=None,  # Run on demand
    catchup=False,
    tags=["example", "snowflake"],
) as dag:
    python_task = PythonOperator(
        task_id="print_private_key",
        python_callable=python_test,
    )
```
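
The private-key handling in the task above can be sanity-checked outside Airflow. A minimal sketch with a throwaway RSA key and a made-up passphrase (no Snowflake or Airflow involved) that round-trips the key through base64 the same way the task does:

```python
import base64

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Stand-in for conn.password
passphrase = b"example-passphrase"

# Generate a throwaway RSA key and serialize it encrypted with the passphrase
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(passphrase),
)

# Simulate storing the key base64-encoded in the connection's extra field...
private_key_content = base64.b64encode(pem).decode()

# ...then decode and load it back the way the task does
private_key_pem = base64.b64decode(private_key_content)
p_key = serialization.load_pem_private_key(private_key_pem, password=passphrase)
print(p_key.key_size)  # → 2048
```

If this round-trip works but the task still fails or logs `***`, the issue is on the Airflow side (masking or connection retrieval), not in the key handling itself.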



### Operating System

Windows (Podman Fedora Core OS)

### Versions of Apache Airflow Providers

_No response_

### Deployment

Other

### Deployment details

Using Podman with a Fedora CoreOS VM.

```shell
podman run -d --name airflow --network airflow-net --cpus 8 --memory 8192m \
  -e AIRFLOW__CORE__EXECUTOR=LocalExecutor \
  -e AIRFLOW_UID=50000 \
  -e AIRFLOW__CORE__HIDE_SENSITIVE_VAR_CONN_FIELDS=False \
  -e AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow \
  -v doc-dbt:/opt/airflow/doc-dbt \
  -v airflow-dags:/opt/airflow/dags \
  -v airflow-logs:/opt/airflow/logs \
  -v airflow-plugins:/opt/airflow/plugins \
  -p 8080:8080 apache/airflow:2.10.2 webserver
```


### Anything else?

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)


GitHub link: https://github.com/apache/airflow/discussions/58456
