xelita opened a new issue #19962:
URL: https://github.com/apache/airflow/issues/19962
### Apache Airflow version
2.2.2 (latest released)
### Operating System
Ubuntu 18.04.4 LTS
### Versions of Apache Airflow Providers
Providers info:
<pre>
apache-airflow-providers-amazon           | 2.4.0
apache-airflow-providers-celery           | 2.1.0
apache-airflow-providers-cncf-kubernetes  | 2.1.0
apache-airflow-providers-docker           | 2.3.0
apache-airflow-providers-elasticsearch    | 2.1.0
apache-airflow-providers-ftp              | 2.0.1
apache-airflow-providers-google           | 6.1.0
apache-airflow-providers-grpc             | 2.0.1
apache-airflow-providers-hashicorp        | 2.1.1
apache-airflow-providers-http             | 2.0.1
apache-airflow-providers-imap             | 2.0.1
apache-airflow-providers-microsoft-azure  | 3.3.0
apache-airflow-providers-mysql            | 2.1.1
apache-airflow-providers-odbc             | 2.0.1
apache-airflow-providers-postgres         | 2.3.0
apache-airflow-providers-redis            | 2.0.1
apache-airflow-providers-sendgrid         | 2.0.1
apache-airflow-providers-sftp             | 2.2.0
apache-airflow-providers-slack            | 4.1.0
apache-airflow-providers-sqlite           | 2.0.1
apache-airflow-providers-ssh              | 2.3.0
</pre>
### Deployment
Docker-Compose
### Deployment details
### Structure of my local directory
<pre>
.
├── airflow.sh
├── config
│ └── log_config.py
├── dags
│ └── test.py
├── docker-compose.yaml
├── logs
├── Makefile
├── parameters.json
├── plugins
└── test.py
</pre>
### docker-compose.yaml
The only changes I made are in the common env section:
- added a PYTHONPATH env variable
- mounted the config folder where my log customization code lives
<pre>
  environment:
    &airflow-common-env
    PYTHONPATH: /opt/airflow/config
    AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS: 'log_config.LOGGING_CONFIG'
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'false'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__API__AUTH_BACKEND: 'airflow.api.auth.backend.basic_auth'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
    - ./config:/opt/airflow/config
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    &airflow-common-depends-on
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy
</pre>
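As background on what the `PYTHONPATH` variable is expected to do: entries in `PYTHONPATH` are prepended to `sys.path` at interpreter startup, which is what should let Airflow import `log_config` from the mounted folder. A quick self-contained check of that mechanism (the path below is the one from my compose file, reused purely for illustration):

```python
import os
import subprocess
import sys

# Launch a child interpreter with PYTHONPATH set, and check that the
# entry shows up verbatim in its sys.path. The directory does not need
# to exist for it to appear there.
env = dict(os.environ, PYTHONPATH="/opt/airflow/config")
out = subprocess.run(
    [sys.executable, "-c", "import sys; print('/opt/airflow/config' in sys.path)"],
    env=env,
    capture_output=True,
    text=True,
)
print(out.stdout.strip())  # → True
```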
log_config.py (nothing special for now):
<pre>
from copy import deepcopy
from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
</pre>
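For context, the `deepcopy` pattern above is what makes later tweaks safe: the copy can be mutated without touching Airflow's default. A stand-in sketch of the intended customization (the dict below only imitates the shape of `DEFAULT_LOGGING_CONFIG`, since Airflow itself is not importable here):

```python
import logging.config
from copy import deepcopy

# Stand-in for airflow.config_templates.airflow_local_settings
# .DEFAULT_LOGGING_CONFIG; only the shape matters for this sketch.
DEFAULT_LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow": {"format": "%(asctime)s %(levelname)s - %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "airflow"},
    },
    "root": {"handlers": ["console"], "level": "INFO"},
}

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
# Example tweak: raise the root log level; the deepcopy keeps the
# default dict untouched.
LOGGING_CONFIG["root"]["level"] = "DEBUG"

logging.config.dictConfig(LOGGING_CONFIG)
```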
### What happened
When executing the `docker-compose up` command, the process exits with the following errors:
<pre>
airflow_postgres_1 is up-to-date
airflow_redis_1 is up-to-date
Recreating airflow_airflow-init_1 ... done
ERROR: for airflow-scheduler Container "2b3fbb5a9e97" exited with code 1.
ERROR: for airflow-webserver Container "2b3fbb5a9e97" exited with code 1.
ERROR: for airflow-worker Container "2b3fbb5a9e97" exited with code 1.
ERROR: for flower Container "2b3fbb5a9e97" exited with code 1.
ERROR: for airflow-triggerer Container "2b3fbb5a9e97" exited with code 1.
ERROR: Encountered errors while bringing up the project.
</pre>
When looking at the error using the `docker logs` command:
<pre>
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/logging_config.py", line 41, in configure_logging
    logging_config = import_string(logging_class_path)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/module_loading.py", line 32, in import_string
    module = import_module(module_path)
  File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'log_config'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 5, in <module>
    from airflow.__main__ import main
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 46, in <module>
    settings.initialize()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/settings.py", line 483, in initialize
    LOGGING_CLASS_PATH = configure_logging()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/logging_config.py", line 50, in configure_logging
    raise ImportError(f'Unable to load custom logging from {logging_class_path} due to {err}')
ImportError: Unable to load custom logging from log_config.LOGGING_CONFIG due to No module named 'log_config'
ERROR!!!: Too old Airflow version !
The minimum Airflow version supported: 2.2.0. Only use this or higher!
</pre>
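The failure mode in the traceback can be reproduced outside Docker: the `ModuleNotFoundError` simply means `log_config.py` sits in a directory the interpreter is not searching. A minimal sketch (the file contents and temp directory below are made up for illustration):

```python
import importlib
import os
import sys
import tempfile

# Create a log_config.py in a directory that is NOT on sys.path,
# mirroring the mounted config folder inside the container.
cfg_dir = tempfile.mkdtemp()
with open(os.path.join(cfg_dir, "log_config.py"), "w") as f:
    f.write("LOGGING_CONFIG = {'version': 1}\n")

# Without the directory on sys.path the import fails, exactly as in
# the traceback above.
try:
    importlib.import_module("log_config")
    found = True
except ModuleNotFoundError:
    found = False
print(found)  # → False

# Adding the directory to sys.path (the effect PYTHONPATH=/opt/airflow/config
# should have inside the container) makes the import succeed.
sys.path.insert(0, cfg_dir)
module = importlib.import_module("log_config")
print(module.LOGGING_CONFIG)  # → {'version': 1}
```

So the question is why the `PYTHONPATH` set in the common env section does not take effect inside the Airflow containers.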
### What you expected to happen
Being able to use my custom logger definition.
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)