JonnyWaffles opened a new issue, #34612:
URL: https://github.com/apache/airflow/issues/34612

   ### Apache Airflow version
   
   2.7.1
   
   ### What happened
   
   Hi team,
   
   In the past you could use `AIRFLOW__CELERY__BROKER_URL_SECRET` as a way to retrieve the credentials from a secrets backend at runtime. However, as of Airflow 2.7, this technique appears to be broken. I believe this is related to the discussion [34030 - Celery configuration elements not shown to be fetched with _CMD pattern](https://github.com/apache/airflow/discussions/34030). The irony is that the pattern works when using the `config get-value` command, but does not work when using the actual `airflow celery` command. I suspect this has something to do with when the wrapper calls `ProvidersManager().initialize_providers_configuration()`.
   
   ```cmd
   unset AIRFLOW__CELERY__BROKER_URL
   AIRFLOW__CELERY__BROKER_URL_SECRET=broker_url_east airflow config get-value 
celery broker_url
   ``` 
   This correctly prints the secret from the backend!
   ```
   redis://:<long password>@<my url>:6379/1
   ```
   However, actually executing Celery with the same methodology results in the default Redis:
   ```cmd
   AIRFLOW__CELERY__BROKER_URL_SECRET=broker_url_east airflow celery worker
   ```
   Relevant output
   ```
   - ** ---------- [config]
   - ** ---------- .> app:         
airflow.providers.celery.executors.celery_executor:0x7f4110be1e50
   - ** ---------- .> transport:   redis://redis:6379/0
   ```
   Notice the default `redis://redis:6379/0` transport, with the default host and port and no password.
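
   For context, my understanding of the documented lookup order for these suffixes is roughly the following. This is a simplified, illustrative sketch, not Airflow's actual implementation; `FakeSecretsBackend` and `resolve_option` are hypothetical names:

```python
import os

class FakeSecretsBackend:
    """Toy stand-in for a configured secrets backend (illustrative only)."""
    def __init__(self, store):
        self._store = store

    def get_config(self, key):
        return self._store.get(key)

def resolve_option(section, key, backend):
    """Simplified lookup order: the plain env var wins; otherwise the
    _SECRET variant names a key to fetch from the secrets backend."""
    env_key = f"AIRFLOW__{section.upper()}__{key.upper()}"
    if env_key in os.environ:
        return os.environ[env_key]
    secret_key = os.environ.get(env_key + "_SECRET")
    if secret_key is not None:
        return backend.get_config(secret_key)
    return None

backend = FakeSecretsBackend({"broker_url_east": "redis://:pw@host:6379/1"})
os.environ.pop("AIRFLOW__CELERY__BROKER_URL", None)
os.environ["AIRFLOW__CELERY__BROKER_URL_SECRET"] = "broker_url_east"
print(resolve_option("celery", "broker_url", backend))
# → redis://:pw@host:6379/1
```

   The bug report above suggests the `airflow celery` command never gets to the `_SECRET` step of this lookup for provider-defined options.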
   
   ### What you think should happen instead
   
   I would expect the `airflow celery` command to resolve the `_SECRET` (and `_CMD`) variants the same way the `config` command does.
   
   ### How to reproduce
   
   You must use a secrets backend to reproduce the `_SECRET` case as described above. You can also do
   ```cmd
   AIRFLOW__CELERY__BROKER_URL_CMD='/usr/bin/env bash -c "echo -n ZZZ"' airflow 
celery worker
   ```
   and you will see that the `ZZZ` value is disregarded:
   ```
   - ** ---------- .> app:         
airflow.providers.celery.executors.celery_executor:0x7f0506d49e20
   - ** ---------- .> transport:   redis://redis:6379/0
   ```
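
   The `_CMD` variant works analogously, except the env var holds a command whose stdout supplies the value. A minimal sketch of that behavior (again illustrative, not Airflow's implementation; `resolve_cmd_option` is a hypothetical name):

```python
import os
import subprocess

def resolve_cmd_option(section, key):
    """Simplified _CMD resolution: if AIRFLOW__SECTION__KEY is unset but
    AIRFLOW__SECTION__KEY_CMD is set, run that command and use its stdout."""
    env_key = f"AIRFLOW__{section.upper()}__{key.upper()}"
    if env_key in os.environ:
        return os.environ[env_key]
    cmd = os.environ.get(env_key + "_CMD")
    if cmd is not None:
        result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        return result.stdout
    return None

os.environ.pop("AIRFLOW__CELERY__BROKER_URL", None)
os.environ["AIRFLOW__CELERY__BROKER_URL_CMD"] = "printf ZZZ"
print(resolve_cmd_option("celery", "broker_url"))  # → ZZZ
```

   The repro above shows the worker ignoring exactly this step for `[celery] broker_url`.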
   
   It appears that neither the historical `_CMD` nor the `_SECRET` mechanism works after the refactor that moved Celery to the providers package.
   
   ### Operating System
   
   Ubuntu 20.04
   
   ### Versions of Apache Airflow Providers
   
   Relevant ones:
   ```
   apache-airflow-providers-celery          3.3.3
   celery                                   5.3.4
   ```
   
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   I know this has something to do with when `ProvidersManager().initialize_providers_configuration()` is executed, but I don't know the right place to put the fix.
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

