Limess opened a new issue, #29537: URL: https://github.com/apache/airflow/issues/29537
### Apache Airflow version

Other Airflow 2 version (please specify below)

### What happened

Using Airflow `2.3.4`.

We removed from `airflow.cfg` any config values we did not explicitly set. This was to make future upgrades less involved: we would only need to compare the configuration values we explicitly set, rather than every permutation of defaults across versions. This has been [recommended in Slack](https://apache-airflow.slack.com/archives/CCQB40SQJ/p1668441275427859?thread_ts=1668394200.637899&cid=CCQB40SQJ) as an approach.

For example, we set `AIRFLOW__CELERY__BROKER_URL` as an environment variable rather than in `airflow.cfg`, so we removed the `[celery]` section from the Airflow configuration entirely. We set `AIRFLOW__CORE__EXECUTOR=CeleryExecutor`, so we are using the Celery executor.

Upon starting the Airflow scheduler, it exited with code `1` and this message:

```
The section [celery] is not found in config.
```

After adding an empty

```
[celery]
```

section back into `airflow.cfg`, the error went away. I have verified that it still picks up `AIRFLOW__CELERY__BROKER_URL` correctly.

### What you think should happen instead

I'd expect Airflow to fall back to the defaults as listed [here](https://airflow.apache.org/docs/apache-airflow/2.3.4/howto/set-config.html).

### How to reproduce

1. Set up a Docker image for the Airflow `scheduler` with `apache/airflow:slim-2.3.4-python3.10` and the following configuration in `airflow.cfg`:

   ```
   [core]
   # The executor class that airflow should use. Choices include
   # ``SequentialExecutor``, ``LocalExecutor``, ``CeleryExecutor``, ``DaskExecutor``,
   # ``KubernetesExecutor``, ``CeleryKubernetesExecutor`` or the
   # full import path to the class when using a custom executor.
   executor = CeleryExecutor

   [logging]
   [metrics]
   [secrets]
   [cli]
   [debug]
   [api]
   [lineage]
   [atlas]
   [operators]
   [hive]
   [webserver]
   [email]
   [smtp]
   [sentry]
   [celery_kubernetes_executor]
   [celery_broker_transport_options]
   [dask]
   [scheduler]
   [triggerer]
   [kerberos]
   [github_enterprise]
   [elasticsearch]
   [elasticsearch_configs]
   [kubernetes]
   [smart_sensor]
   ```

2. Run the `scheduler` command, also setting `AIRFLOW__CELERY__BROKER_URL` to point to a Celery Redis broker.
3. Observe that the scheduler exits.

### Operating System

Ubuntu 20.04.5 LTS (Focal Fossa)

### Versions of Apache Airflow Providers

_No response_

### Deployment

Other Docker-based deployment

### Deployment details

AWS ECS, Docker image `apache/airflow:slim-2.3.4-python3.10`.

Separate services for:

- Webserver
- Triggerer
- Scheduler
- Celery worker
- Celery flower

### Anything else

This seems to occur due to this check in the Airflow image entrypoint: https://github.com/apache/airflow/blob/28126c12fbdd2cac84e0fbcf2212154085aa5ed9/scripts/docker/entrypoint_prod.sh#L203-L212

A minimal reproduction of the mismatched lookups, and a sketch of the failing check, follow after the template sections below.

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
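The contradiction is visible directly from Python, outside Docker. This is a minimal sketch, assuming `apache-airflow==2.3.4` is installed and the active `airflow.cfg` has no `[celery]` section; the Redis URL is a placeholder:

```python
import os

# Placeholder broker URL; set it before Airflow's config is first read.
os.environ["AIRFLOW__CELERY__BROKER_URL"] = "redis://redis:6379/0"

from airflow.configuration import conf

# Normal config resolution checks AIRFLOW__{SECTION}__{KEY} environment
# variables first, so this succeeds with no [celery] section in airflow.cfg:
print(conf.get("celery", "broker_url"))  # -> redis://redis:6379/0

# But has_section() only reports sections parsed from the config file itself,
# so the very same section looks "missing":
print(conf.has_section("celery"))  # -> False
```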

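The linked entrypoint lines appear to shell out to `airflow config get-value celery broker_url` before starting a Celery-backed component, and in 2.3.x that CLI command guards the lookup with `has_section()`. A rough paraphrase of `airflow/cli/commands/config_command.py`, reconstructed from the error message rather than copied verbatim:

```python
from airflow.configuration import conf


def get_value(section: str, option: str) -> None:
    # Rough paraphrase of `airflow config get-value` in 2.3.x, reconstructed
    # from the error message above; not a verbatim copy of the source.
    if not conf.has_section(section):
        # has_section() ignores AIRFLOW__* environment variables, so we bail
        # out here even though conf.get() below would resolve the option.
        raise SystemExit(f"The section [{section}] is not found in config.")
    if not conf.has_option(section, option):
        raise SystemExit(f"The option [{section}/{option}] is not found in config.")
    print(conf.get(section, option))


get_value("celery", "broker_url")
# SystemExit: The section [celery] is not found in config.
```

If that reading is right, either making `has_section()` aware of `AIRFLOW__*` environment variables or dropping the guard in favour of a plain `conf.get()` call, which already handles the env-var fallback, would let this setup through.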