throwawayaccount11111 opened a new issue, #31947:
URL: https://github.com/apache/airflow/issues/31947

   ### Apache Airflow version
   
   2.6.1
   
   ### What happened
   
   We are running into an issue where the executor's running-task metrics in Grafana are not being cleared. This causes Airflow to slow down progressively, to the point where we need to restart the scheduler every 2-3 weeks to get things back to normal. However, before I open that bug, I want to make sure it is not related to a DAG implementation, and I would like to capture one specific debug log from `kubernetes_executor.py` that logs the list of running tasks:
   
   ```
              self.log.debug("self.running: %s", self.running)
   ```
   
   That code is located [here](https://github.com/apache/airflow/blob/2.5.3/airflow/executors/kubernetes_executor.py#L610). Since I use `LocalExecutor`, I also tried another class (see the logging configuration snippet below).
   
   Long story short, I am unable to enable debug logging for only these classes. The global `INFO` log level is applied instead: unless the global setting is changed to `DEBUG`, no debug logs are emitted. And setting the global level to `DEBUG` is a bit tricky because it generates far too many logs in production.
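   For reference, the global level mentioned here is Airflow's `[logging] logging_level` option, which in our deployment would be set through the environment, e.g.:
   
   ```
   AIRFLOW__LOGGING__LOGGING_LEVEL=DEBUG
   ```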
   
   The log configuration is:
   
   ```python
   from copy import deepcopy
   
   # import the default logging configuration
   from airflow.config_templates.airflow_local_settings import 
DEFAULT_LOGGING_CONFIG
   
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   
   LOGGING_CONFIG['loggers']['airflow_executors_kubernetes_executor'] = {
       "class": "airflow.executors.kubernetes_executor",
       "handlers": ["console"],
       "level": 'DEBUG',
       "propagate": True,
   }
   
   LOGGING_CONFIG['loggers']['airflow_jobs_scheduler_job'] = {
       "class": "airflow.jobs.scheduler_job",
       "handlers": ["console"],
       "level": 'DEBUG',
       "propagate": True,
   }
   
   ```
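   For reference, in the standard library's `logging.config.dictConfig` schema each key under `loggers` is itself the logger name, and the only recognized per-logger options are `level`, `propagate`, `filters`, and `handlers`; there is no `class` key for loggers. Since Airflow's loggers are generally named after their dotted module paths, a minimal sketch of the conventional stdlib form would look like this (the exact logger names below are an assumption based on the module paths):
   
   ```python
   from copy import deepcopy
   
   # import the default logging configuration
   from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
   
   LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
   
   # The dict key is the logger name; with a dotted name, child loggers
   # (e.g. the executor class's logger) inherit this level unless they
   # set their own.
   LOGGING_CONFIG["loggers"]["airflow.executors.kubernetes_executor"] = {
       "handlers": ["console"],
       "level": "DEBUG",
       "propagate": False,  # set True to also forward records to ancestor handlers
   }
   ```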
   
   I have validated that the log configuration is loaded correctly by looking at the output of the CLI command `airflow config`:
   
   ```
   [logging]
   ...
   logging_config_class = config.log_config.LOGGING_CONFIG
   ...
   ```
   
   I can also see the following line in the logs:
   ```
   Successfully imported user-defined logging config from 
config.log_config.LOGGING_CONFIG
   ```
   
   
   
   ### What you think should happen instead
   
   The `DEBUG` logs from the classes specified in the logging configuration should be emitted rather than the global log level being applied.
   
   ### How to reproduce
   
   Configure custom logging. This setup may vary for everyone; I set it up in a Dockerfile like this:
   
   ```dockerfile
   FROM apache/airflow:slim-2.5.3-python3.9
   
   COPY requirements.txt ./
   
   RUN pip install -r requirements.txt --no-cache-dir --use-deprecated=legacy-resolver
   
   USER root
   
   RUN apt-get update && apt-get install -y sudo net-tools smbclient
   
   RUN chmod -R 777 /opt/airflow/decrypt && chown -R airflow /opt/airflow/
   
   USER airflow
   
   ENV PYTHONPATH=/opt/airflow/dags
   
   # The following is required for custom loggers.
   RUN mkdir $PYTHONPATH/config
   COPY log_config.py $PYTHONPATH/config/
   RUN touch $PYTHONPATH/config/__init__.py
   ENV AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS=config.log_config.LOGGING_CONFIG
   ```
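   One way to check whether the custom logger configuration actually took effect is to inspect the effective level from a Python shell inside the container. A minimal stdlib sketch (the logger name here is an assumption based on the module path):
   
   ```python
   import logging
   
   # If the dictConfig entry matched a real logger name, the effective
   # level should come back as DEBUG rather than the inherited INFO.
   logger = logging.getLogger("airflow.executors.kubernetes_executor")
   print(logging.getLevelName(logger.getEffectiveLevel()))
   ```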
   
   ### Operating System
   
   macOS Ventura 13.1
   
   ### Versions of Apache Airflow Providers
   
   The `requirements.txt` file used in the above Dockerfile:
   
   ```
   # This file contains the dependencies required by DAGs and is used to build the Airflow docker image.
   
   # Alphabetical
   aiofiles==23.1.0
   aiohttp==3.8.4
   airflow-dbt>=0.4.0
   airflow-exporter==1.5.3
   anytree==2.8.0
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-http>=2.0.3
   apache-airflow-providers-microsoft-mssql==2.1.3
   apache-airflow-providers-snowflake>=4.0.4
   apache-airflow-providers-hashicorp==3.3.0
   apache-airflow-providers-cncf-kubernetes==5.2.2
   apache-airflow>=2.2.3
   asgiref==3.5.0
   Authlib==0.15.5
   dbt-snowflake==1.2.0
   flatdict==4.0.1
   hvac==0.11.2
   jsonschema>=4.17.3
   pandas==1.3.5
   psycopg2-binary==2.9.3
   pyOpenSSL==23.1.1
   pysftp==0.2.9
   pysmbclient==0.1.5
   python-gnupg==0.5.0
   PyYAML~=5.4.1
   requests~=2.26.0
   smbprotocol==1.9.0
   snowflake-connector-python==3.0.2
   snowflake-sqlalchemy==1.4.7
   statsd==3.3.0
   ```
   
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   I am using the Airflow-provided docker-compose file, which can be found [here](https://airflow.apache.org/docs/apache-airflow/stable/docker-compose.yaml).
   
   The Airflow version I am using is 2.5.3.
   
   ### Anything else
   
   The problem occurs every time.
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

