santurini opened a new issue, #51874:
URL: https://github.com/apache/airflow/issues/51874

   ### Official Helm Chart version
   
   1.16.0 (latest released)
   
   ### Apache Airflow version
   
   2.9.2
   
   ### Kubernetes Version
   
   1.31.2
   
   ### Helm Chart configuration
   
   ```
   useStandardNaming: false
   uid: 50000
   gid: 0
   securityContext: {}
   images:
     airflow:
       repository: ${IMAGE_NAME}
       tag: ${VERSION}
       digest: ~
       pullPolicy: IfNotPresent
     useDefaultImageForMigration: false
     migrationsWaitTimeout: 60
     pod_template:
       pullPolicy: IfNotPresent
     flower:
       pullPolicy: IfNotPresent
     statsd:
       repository: quay.io/prometheus/statsd-exporter
       tag: v0.26.1
       pullPolicy: IfNotPresent
     redis:
       repository: redis
       tag: 7.2-bookworm
       pullPolicy: IfNotPresent
     pgbouncer:
       repository: apache/airflow
       tag: airflow-pgbouncer-2024.01.19-1.21.0
       pullPolicy: IfNotPresent
     pgbouncerExporter:
       repository: apache/airflow
       tag: airflow-pgbouncer-exporter-2024.01.19-0.16.0
       pullPolicy: IfNotPresent
     gitSync:
       repository: registry.k8s.io/git-sync/git-sync
       tag: v4.1.0
       pullPolicy: IfNotPresent
   
   env: 
   - name: "AIRFLOW__API__AUTH_BACKEND"
     value: "airflow.api.auth.backend.basic_auth"
   - name: "AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION"
     value: "True"
   - name: "AIRFLOW__CORE__LOAD_EXAMPLES"
     value: "False"
   - name: AIRFLOW__CORE__EXECUTE_TASKS_NEW_PYTHON_INTERPRETER
     value: "True"
   - name: AIRFLOW__LOGGING__REMOTE_LOGGING
     value: "True"
   - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
     value: "LoggingS3Connection"
   - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
     value: "s3://logs-archive-product-team/Airflow"
   - name: AIRFLOW__LOGGING__ENCRYPT_S3_LOGS
     value: "False"
   - name: AIRFLOW__CORE__ENABLE_XCOM_PICKLING
     value: "True"
   - name: AIRFLOW__SMTP__SMTP_HOST
     value: "smtp.gmail.com"
   - name: AIRFLOW__SMTP__SMTP_PORT
     value: "587"
   - name: AIRFLOW__SMTP__SMTP_SSL
     value: "False"
   - name: AIRFLOW__SMTP__SMTP_STARTTLS
     value: "True"
   
   secret: 
   - envName: "GIT_TOKEN"
     secretName: "airflow-ssh-secret"
     secretKey: "GIT_TOKEN"
   - envName: "AWS_ACCESS_KEY_ID"
     secretName: "airflow-aws"
     secretKey: "AWS_ACCESS_KEY_ID"
   - envName: "AWS_SECRET_ACCESS_KEY"
     secretName: "airflow-aws"
     secretKey: "AWS_SECRET_ACCESS_KEY"
   - envName: "AWS_ACCESS_KEY_ID_DATASETS"
     secretName: "airflow-aws"
     secretKey: "AWS_ACCESS_KEY_ID_DATASETS"
   - envName: "AWS_SECRET_ACCESS_KEY_DATASETS"
     secretName: "airflow-aws"
     secretKey: "AWS_SECRET_ACCESS_KEY_DATASETS"
   - envName: "FRONT_TOKEN"
     secretName: "airflow-connections"
     secretKey: "FRONT_TOKEN"
   - envName: "ASANA_TOKEN"
     secretName: "airflow-connections"
     secretKey: "ASANA_TOKEN"
   - envName: "SLACK_TOKEN"
     secretName: "airflow-connections"
     secretKey: "SLACK_TOKEN"
   - envName: "AIRFLOW__SMTP__SMTP_PASSWORD"
     secretName: "airflow-connections"
     secretKey: "AIRFLOW__SMTP__SMTP_PASSWORD"
   - envName: "AWS_BUCKET_MODEL"
     secretName: "airflow-aws"
     secretKey: "AWS_BUCKET_MODEL"
   - envName: "AWS_REGION_NAME"
     secretName: "airflow-aws"
     secretKey: "AWS_REGION_NAME"
   - envName: "REDIS_PASSWORD"
     secretName: "airflow-connections"
     secretKey: "REDIS_PASSWORD"
   
   enableBuiltInSecretEnvVars:
     AIRFLOW__CORE__FERNET_KEY: true
     AIRFLOW__CORE__SQL_ALCHEMY_CONN: true
     AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: true
     AIRFLOW_CONN_AIRFLOW_DB: true
     AIRFLOW__WEBSERVER__SECRET_KEY: true
     AIRFLOW__CELERY__CELERY_RESULT_BACKEND: true
     AIRFLOW__CELERY__RESULT_BACKEND: true
     AIRFLOW__CELERY__BROKER_URL: true
     AIRFLOW__ELASTICSEARCH__HOST: true
     AIRFLOW__ELASTICSEARCH__ELASTICSEARCH_HOST: true
   
   data:
     metadataConnection:
       user: airflow
       pass: ${AIRFLOW_DB_PASSWORD}
       protocol: postgresql
       host: ds-postgres-prod.pippo.com
       port: 5432
       db: airflow
       sslmode: disable
     resultBackendConnection: ~
     brokerUrl: redis://pippo:${REDIS_PASSWORD}@ds-nosql-prod.pippo.com:6379/6
   
   # Fernet key settings
   # Note: fernetKey can only be set during install, not upgrade
   fernetKey: ~
   fernetKeySecretName: ~
   ```
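With both `fernetKey` and `fernetKeySecretName` left as `~` in this values file, the chart generates a fresh random Fernet key for every new release, so values the old installation encrypted in the metadata database can no longer be decrypted. A minimal sketch of the symptom, assuming only the `cryptography` package (the same library Airflow uses for this):

```python
from cryptography.fernet import Fernet, InvalidToken

old_key = Fernet.generate_key()  # key from the original installation
new_key = Fernet.generate_key()  # key the fresh chart release generated

# What sits in the _val column in Postgres: ciphertext under the old key.
token = Fernet(old_key).encrypt(b"my-variable-value")

# The old key still decrypts it...
assert Fernet(old_key).decrypt(token) == b"my-variable-value"

# ...but the new installation's key cannot, which surfaces as the
# "Can't decrypt _val ... invalid token or value" error in the logs.
try:
    Fernet(new_key).decrypt(token)
except InvalidToken:
    print("Can't decrypt _val: invalid token")
```

This is why reusing the database alone is not enough: the Fernet key lives in the release's Kubernetes secret, not in PostgreSQL.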
   
   ### Docker Image customizations
   
   ```
   FROM apache/airflow:2.9.2-python3.11
   
   ARG GIT_TOKEN
   ENV GIT_TOKEN=$GIT_TOKEN
   
    USER root
    RUN apt-get update \
        && apt-get install -y --no-install-recommends git g++ \
        && rm -rf /var/lib/apt/lists/*
   
   USER $AIRFLOW_UID
   ```
   
   ### What happened
   
   I had to recreate the Kubernetes cluster and assumed that pointing the new installation at the same PostgreSQL database would restore the previous state. Instead, DAG imports failed because the encrypted variables are reported as missing.
   
   To test further, I created a new Variable and a very simple DAG that prints the newly defined encrypted variable. In this case there is no DAG import error, but the task fails like this:
   ```
    [2025-06-18, 07:58:58 UTC] {variable.py:80} ERROR - Can't decrypt _val for key=SAMPLE_VARIABLE, invalid token or value
    [2025-06-18, 07:58:58 UTC] {simple_variable_reader.py:33} ERROR - Error reading variable: 'Variable SAMPLE_VARIABLE does not exist'
   ```
   
   For the old encrypted variables and connections, we had to manually disable encryption and update the values directly in PostgreSQL to resolve the import errors.
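Manually disabling encryption should not be necessary if the old key can still be recovered (for example from a backup of the old release's fernet-key secret, named `<release>-fernet-key` by default). Airflow accepts a comma-separated key list in `AIRFLOW__CORE__FERNET_KEY` (`new_key,old_key`), and `airflow rotate-fernet-key` then re-encrypts every row under the new key. A sketch of that mechanism with `MultiFernet`, under the assumption that the old key was preserved:

```python
from cryptography.fernet import Fernet, MultiFernet

old_key = Fernet.generate_key()  # key from the deleted installation
new_key = Fernet.generate_key()  # key from the new chart release

# A row as stored in the metadata DB, encrypted under the old key.
token = Fernet(old_key).encrypt(b"secret-connection-uri")

# AIRFLOW__CORE__FERNET_KEY="new_key,old_key" behaves like this MultiFernet:
# decryption tries each key in order, rotate() re-encrypts under the first.
mf = MultiFernet([Fernet(new_key), Fernet(old_key)])
assert mf.decrypt(token) == b"secret-connection-uri"

rotated = mf.rotate(token)
assert Fernet(new_key).decrypt(rotated) == b"secret-connection-uri"
```

After rotation, the old key can be dropped from the configuration.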
   
   ### What you think should happen instead
   
   The PostgreSQL database should store all the information necessary to re-establish the previous state when a new installation connects to it.
   
   ### How to reproduce
   
   1. Create an Airflow namespace and an installation connected to a PostgreSQL database that contains encrypted variables.
   2. Delete the namespace and all of its resources.
   3. Create a new Airflow installation connected to the same PostgreSQL database.
   
   ### Anything else
   
   How can I solve this problem and recover the variables and connections?
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

