metropolis-ameer opened a new issue #15476:
URL: https://github.com/apache/airflow/issues/15476


   **Apache Airflow version**: 2.0.2, 1.10.15
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): not used
   
   **Environment**: docker python:3.7-slim-buster
   
   - **Cloud provider or hardware configuration**: container image in docker
   - **OS** (e.g. from /etc/os-release): Debian GNU/Linux 10 (buster)
   - **Kernel** (e.g. `uname -a`): Linux f215c7ef4950 4.19.121-linuxkit #1 SMP Thu Jan 21 15:36:34 UTC 2021 x86_64 GNU/Linux
   - **Install tools**: airflow official dockerfile
   - **Others**: 
   
   **What happened**:
   
   I am unable to set `sql_alchemy_conn` via the environment variable `AIRFLOW__CORE__SQL_ALCHEMY_CONN_SECRET` using the secrets backend through AWS Systems Manager Parameter Store. Different versions produce different error messages.
   2.0.2 produces the following error message:
   <details><summary>2.0.2 error</summary> Traceback (most recent call last):
     File "/home/airflow/.local/bin/airflow", line 5, in <module>
       from airflow.__main__ import main
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 34, in <module>
       from airflow import settings
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/settings.py", line 37, in <module>
       from airflow.configuration import AIRFLOW_HOME, WEBSERVER_CONFIG, conf  # NOQA F401
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 1098, in <module>
       conf = initialize_config()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 860, in initialize_config
       conf.validate()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 199, in validate
       self._validate_config_dependencies()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 227, in _validate_config_dependencies
       is_sqlite = "sqlite" in self.get('core', 'sql_alchemy_conn')
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 328, in get
       option = self._get_environment_variables(deprecated_key, deprecated_section, key, section)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 394, in _get_environment_variables
       option = self._get_env_var_option(section, key)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 298, in _get_env_var_option
       return _get_config_value_from_secret_backend(os.environ[env_var_secret_path])
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 83, in _get_config_value_from_secret_backend
       secrets_client = get_custom_secret_backend()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 999, in get_custom_secret_backend
       secrets_backend_cls = conf.getimport(section='secrets', key='backend')
   NameError: name 'conf' is not defined </details>
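   A guess at what the 2.0.2 traceback shows (a minimal sketch, not Airflow's actual code): while `conf = initialize_config()` is still running, validation needs `sql_alchemy_conn`; because a `_SECRET` env var is set, resolving it reaches `get_custom_secret_backend()`, which reads the module-global `conf` before the assignment has completed:

   ```
   # Minimal sketch of the initialization-order bug (simplified; the class and
   # function names mirror the traceback, the bodies are invented here):
   def get_custom_secret_backend():
       # `conf` is not defined yet -- we are still inside initialize_config().
       return conf.getimport(section='secrets', key='backend')

   class AirflowConfigParser:
       def validate(self):
           # Resolving sql_alchemy_conn from a *_SECRET env var ends up here.
           get_custom_secret_backend()

   def initialize_config():
       parser = AirflowConfigParser()
       parser.validate()  # raises NameError: name 'conf' is not defined
       return parser

   try:
       conf = initialize_config()
   except NameError as exc:
       error = str(exc)

   print(error)  # -> name 'conf' is not defined
   ```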
   
   1.10.15 produces the following error message:
   <details><summary>1.10.15 error</summary> Traceback (most recent call last):
     File "/home/airflow/.local/bin/airflow", line 25, in <module>
       from airflow.configuration import conf
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 31, in <module>
       from airflow.utils.log.logging_mixin import LoggingMixin
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/__init__.py", line 24, in <module>
       from .decorators import apply_defaults as _apply_defaults
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/decorators.py", line 36, in <module>
       from airflow import settings
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/settings.py", line 38, in <module>
       from airflow.configuration import conf, AIRFLOW_HOME, WEBSERVER_CONFIG  # NOQA F401
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 737, in <module>
       conf.read(AIRFLOW_CONFIG)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 425, in read
       self._validate()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 217, in _validate
       self._validate_config_dependencies()
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 251, in _validate_config_dependencies
       self.get('core', 'executor')))
   airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the CeleryExecutor </details>
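   A possible reading of the 1.10.15 behaviour (a hedged sketch, not Airflow's actual code): if the secret lookup cannot be completed this early in startup, `get()` may fall back to the built-in default, which is a sqlite URL — and validation then rejects sqlite with the CeleryExecutor:

   ```
   # Sketch of a silent fallback-to-default (names and logic invented here):
   DEFAULTS = {("core", "sql_alchemy_conn"): "sqlite:////home/airflow/airflow.db"}

   def get(section, key, secret_lookup=None):
       """Return a config value, falling back to the built-in default when the
       secrets backend cannot be used."""
       if secret_lookup is not None:
           try:
               return secret_lookup(key)
           except Exception:
               pass  # backend unusable this early in startup -> silent fallback
       return DEFAULTS[(section, key)]

   def unavailable_backend(key):
       raise RuntimeError("secrets backend not initialised yet")

   value = get("core", "sql_alchemy_conn", unavailable_backend)
   print(value)  # -> sqlite:////home/airflow/airflow.db
   ```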
   
   **What you expected to happen**:
   
   I expect the variable to be set via the secrets backend, as documented in `aws_secrets_manager.py`.
   
   **How to reproduce it**:
   ```
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \
     --build-arg AIRFLOW_VERSION="2.0.2" \
     --build-arg AIRFLOW_INSTALL_VERSION="==2.0.2" \
     --build-arg AIRFLOW_CONSTRAINTS_REFERENCE="constraints-2.0.2" \
     --build-arg AIRFLOW_BRANCH="v2-0-stable" \
     --build-arg AIRFLOW_SOURCES_FROM="empty" \
     --build-arg AIRFLOW_SOURCES_TO="/empty" \
     --build-arg AIRFLOW_EXTRAS="password,apache.presto,amazon,slack,celery,docker,mysql,postgres,redis,crypto,jdbc,ssh,statsd,virtualenv" \
     --build-arg ADDITIONAL_PYTHON_DEPS="scp" \
     --tag "company/airflow_2-0-2_base"
   ```
   Next, build a derived docker image, with an AWS credential file available.
   
   In `airflow.cfg`, under secrets:
   set `backend` = `airflow.contrib.secrets.aws_systems_manager.SystemsManagerParameterStoreBackend`
   set `backend_kwargs` = `{"connections_prefix": "/airflow/connections", "variables_prefix": "/airflow/variables", "config_prefix": "/airflow/config", "profile_name": "default"}`
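   As an `airflow.cfg` fragment, the two settings above would look like this (section name `[secrets]` assumed, per the Airflow docs):

   ```
   [secrets]
   backend = airflow.contrib.secrets.aws_systems_manager.SystemsManagerParameterStoreBackend
   backend_kwargs = {"connections_prefix": "/airflow/connections", "variables_prefix": "/airflow/variables", "config_prefix": "/airflow/config", "profile_name": "default"}
   ```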
   
   In AWS, create in Parameter Store: `/airflow/config/sql_alchemy_conn` = `postgresql+psycopg2://user:pass@database-host:5432/airflowdb`
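   For reference, one way to create that parameter with the AWS CLI (a sketch using the placeholder value above; `SecureString` is a reasonable type for a connection string containing credentials):

   ```
   aws ssm put-parameter \
     --profile default \
     --name /airflow/config/sql_alchemy_conn \
     --type SecureString \
     --value 'postgresql+psycopg2://user:pass@database-host:5432/airflowdb'
   ```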
   
   Create the environment variable in `entrypoint.sh` (no spaces around `=` in the shell assignment): `AIRFLOW__CORE__SQL_ALCHEMY_CONN_SECRET=sql_alchemy_conn`
   
   ```
   FROM company/airflow_2-0-2_base
   
   COPY --chown=airflow:root aws ${AIRFLOW_USER_HOME_DIR}/.aws
   COPY --chown=airflow:root scripts/entrypoint.sh /entrypoint
   COPY --chown=airflow:root config/airflow.cfg ${AIRFLOW_HOME}/airflow.cfg
   
   ENV PYTHONPATH=${AIRFLOW_HOME}/:$PYTHONPATH
   
   EXPOSE 8080 5555 8793
   
   USER ${AIRFLOW_UID}
   WORKDIR ${AIRFLOW_HOME}
   ENTRYPOINT ["/usr/bin/dumb-init", "--", "/entrypoint"]
   ```
   execute: `docker build --rm -t company/airflow .`
   execute: `docker run -it company/airflow bash`
   execute: `airflow config list`
   
   
   **Anything else we need to know**:
   
   tagging @kaxil as requested in slack discussion: https://apache-airflow.slack.com/archives/CSS36QQS1/p1618965236306400
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
