jonstacks opened a new issue #21469:
URL: https://github.com/apache/airflow/issues/21469
### Official Helm Chart version
1.4.0 (latest released)
### Apache Airflow version
2.2.3 (latest released)
### Kubernetes Version
v1.21.5
### Helm Chart configuration
```yaml
cleanup:
  enabled: true
airflowPodAnnotations:
  vault.hashicorp.com/agent-init-first: "true"
  vault.hashicorp.com/agent-inject: "false"
  vault.hashicorp.com/agent-run-as-user: "50000"
  vault.hashicorp.com/agent-pre-populate-only: "true"
  vault.hashicorp.com/agent-inject-status: "update"
```
### Docker Image customisations
We have customized the `ENTRYPOINT` to export some environment variables
that are loaded from HashiCorp Vault.
The entrypoint line in the Dockerfile:
```Dockerfile
ENTRYPOINT ["/usr/bin/dumb-init", "--", "/opt/airflow/entrypoint.sh"]
```
The last line in the `/opt/airflow/entrypoint.sh` script:
```bash
# Call Airflow's default entrypoint after we source the vault secrets
exec /entrypoint "${@}"
```
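For context, a minimal sketch of what `/opt/airflow/entrypoint.sh` does; the
`/vault/secrets/*.env` path and the sourcing loop are assumptions for
illustration, not the exact script:
```bash
#!/usr/bin/env bash
set -euo pipefail

# Source any env files the Vault agent init container wrote out.
# NOTE: the /vault/secrets path is an assumption, not the real layout.
if compgen -G "/vault/secrets/*.env" > /dev/null; then
    for f in /vault/secrets/*.env; do
        # shellcheck disable=SC1090
        source "$f"
    done
else
    echo "No vault secrets detected"
fi

# Call Airflow's default entrypoint after we source the vault secrets
exec /entrypoint "${@}"
```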
### What happened
The install was successful, and the `webserver` and `scheduler` pods work
as expected. However, the `cleanup` pods launched by the `cleanup` cronjob fail:
```
No vault secrets detected
....................
ERROR! Maximum number of retries (20) reached.
Last check result:
$ airflow db check
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 5, in <module>
    from airflow.__main__ import main
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 34, in <module>
    from airflow import settings
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/settings.py", line 35, in <module>
    from airflow.configuration import AIRFLOW_HOME, WEBSERVER_CONFIG, conf  # NOQA F401
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 1129, in <module>
    conf.validate()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 224, in validate
    self._validate_config_dependencies()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 267, in _validate_config_dependencies
    raise AirflowConfigException(f"error: cannot use sqlite with the {self.get('core', 'executor')}")
airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the KubernetesExecutor
```
### What you expected to happen
It looks like the annotations on the `cleanup` cronjob are static and only
contain an istio annotation:
https://github.com/apache/airflow/blob/c28c255e52255ea2060c1a802ec34f9e09cc4f52/chart/templates/cleanup/cleanup-cronjob.yaml#L56-L60
Based on the documentation in `values.yaml`, I would expect the `cleanup`
cronjob to also pick up these annotations:
https://github.com/apache/airflow/blob/c28c255e52255ea2060c1a802ec34f9e09cc4f52/chart/values.yaml#L187-L189
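One possible fix, sketched here against the template linked above; the
`nindent` width is an assumption about the template's indentation and would
need to match the actual file:
```yaml
# chart/templates/cleanup/cleanup-cronjob.yaml (pod template metadata)
annotations:
  sidecar.istio.io/inject: "false"
  {{- if .Values.airflowPodAnnotations }}
  {{- toYaml .Values.airflowPodAnnotations | nindent 14 }}
  {{- end }}
```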
### How to reproduce
From the root of the airflow repository:
```bash
cd chart
helm dep build
helm template . \
  --set cleanup.enabled=true \
  --set airflowPodAnnotations."my\.test"="somevalue" \
  -s templates/cleanup/cleanup-cronjob.yaml
```
If you look at the annotations section of the output, you will only see the
static, hard-coded `istio` annotation.
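For reference, the rendered pod template annotations come out roughly like
this, with none of the annotations passed via `--set` (exact indentation may
differ):
```yaml
annotations:
  sidecar.istio.io/inject: "false"
```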
### Anything else
Fixing this could potentially be a breaking change, even though the
documentation says these annotations should be applied to all Airflow pods.
If applying the global annotations turns out not to be acceptable, another
option would be to add a `cleanup.podAnnotations` section for supplying them.
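If that route were taken, the configuration might look like the sketch below;
`cleanup.podAnnotations` is a hypothetical key that does not exist in the
chart today:
```yaml
cleanup:
  enabled: true
  # Hypothetical key -- not currently supported by the chart.
  podAnnotations:
    vault.hashicorp.com/agent-init-first: "true"
    vault.hashicorp.com/agent-pre-populate-only: "true"
```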
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)