GitHub user Dhunsheth created a discussion: Airflow 3.1.0 Advanced Logging and Log Suppression

I have configured the Airflow metadata database to use PostgreSQL, connected my secrets backend, and enabled remote logging to CloudWatch on AWS. In addition, I have monitoring enabled on the Airflow logs and receive an email via SNS whenever an error or warning is detected in the logs.
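
For context, the CloudWatch side is wired up with the standard Amazon provider remote-logging settings, along these lines (placeholder values, not my real ARN):

```
AIRFLOW__LOGGING__REMOTE_LOGGING=True
AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=cloudwatch://arn:aws:logs:<region>:<account-id>:log-group:<log-group-name>
AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=aws_default
```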

**The Issue**
My Airflow logs contain warnings from loggers I don't care about, and each one triggers an email (with a cost implication); these are what I am trying to suppress. There are also INFO-level logs that appear repeatedly and that I want to silence in production environments, to minimize noise in the logs so problem resolution is easier.
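
As I understand it, suppression comes down to the standard library's logger hierarchy: raising a named logger's level should drop its records before any handler (and therefore CloudWatch, and therefore the SNS alarm) ever sees them. A minimal standalone sketch of the behavior I'm after:

```python
import logging

# Root logger emits everything at INFO and above.
logging.basicConfig(level=logging.INFO)

noisy = logging.getLogger("botocore.credentials")

noisy.info("before: this INFO record is emitted")

# Raise the named logger's level; INFO records are now dropped at the
# logger itself, before any handler sees them.
noisy.setLevel(logging.WARNING)

noisy.info("after: this INFO record is suppressed")
noisy.warning("after: WARNING records still get through")
```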

However, my implementation of the recommended approach is not suppressing these logs: the examples below still appear in CloudWatch, and I am not sure why. I can see the user-defined logging config is imported successfully for both the scheduler and the triggerer:
```
INFO - Detected user-defined logging config. Attempting to load log_config.LOGGING_CONFIG
INFO - Successfully imported user-defined logging config. FastAPI App will serve log from /home/airflow
```
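
To rule out a stale or shadowed module, one quick check (run with the same PYTHONPATH as the scheduler) is to see which file Python actually resolves for `log_config`:

```python
import importlib

mod = importlib.import_module("log_config")
print(mod.__file__)  # expected: /airflow/config/log_config.py
```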

**Example logs to be suppressed**
"logger": "py.warnings",
"event": "The `airflow.hooks.base.BaseHook` attribute is deprecated. Please use 
`'airflow.sdk.bases.hook.BaseHook'`.",
"level": "warning"

"logger": "botocore.credentials",
"event": "Found credentials from IAM Role",
"level": "info"

"logger": "airflow.models.dagbag.DagBag",
"event": "Filling up the DagBag from..."
"level": "info"

**Current solution that isn't working**
Based on the Airflow advanced logging documentation: 
https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/logging-monitoring/advanced-logging-configuration.html

I created a config directory under airflow, so the file lives at `/airflow/config/log_config.py`.

In my .env file, which overrides the default Airflow variables, I have set:

```
PYTHONPATH=/airflow/config
AIRFLOW__LOGGING__LOGGING_CONFIG_CLASS=log_config.LOGGING_CONFIG
```
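
To confirm the override is visible beyond the startup log line, the value can also be read back through Airflow's config API:

```python
from airflow.configuration import conf

# Should print "log_config.LOGGING_CONFIG" if the .env override
# reached this process.
print(conf.get("logging", "logging_config_class"))
```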

My `log_config.py` file is as follows:

```python
from copy import deepcopy

from pydantic.utils import deep_update

from airflow.config_templates.airflow_local_settings import (
    DEFAULT_LOGGING_CONFIG,
    REMOTE_TASK_LOG,
)

# Start from a copy of Airflow's default logging config and override the
# levels of the noisy loggers, detaching them from any handlers.
LOGGING_CONFIG = deep_update(
    deepcopy(DEFAULT_LOGGING_CONFIG),
    {
        "loggers": {
            "botocore.credentials": {
                "handlers": [],
                "level": "WARNING",
                "propagate": False,
            },
            "airflow.models.dagbag.DagBag": {
                "handlers": [],
                "level": "WARNING",
                "propagate": False,
            },
            "py.warnings": {
                "handlers": [],
                "level": "ERROR",
                "propagate": False,
            },
        }
    },
)

# Re-export so REMOTE_TASK_LOG is still resolvable on this module.
REMOTE_TASK_LOG = REMOTE_TASK_LOG
```
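
To check whether these overrides actually take effect inside a running component, a small diagnostic (e.g. dropped into a throwaway DAG, or run under the same PYTHONPATH) should print each target logger's effective level, handlers, and propagate flag:

```python
import logging

for name in ("botocore.credentials", "airflow.models.dagbag.DagBag", "py.warnings"):
    logger = logging.getLogger(name)
    print(
        name,
        logging.getLevelName(logger.getEffectiveLevel()),
        logger.handlers,
        logger.propagate,
    )
```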

**Environment Setup**
I am running Airflow 3.1.0 on Python 3.12 with the following providers:

```
pip install \
  "apache-airflow==3.1.0" \
  "apache-airflow-providers-microsoft-mssql" \
  "apache-airflow-providers-amazon" \
  "flask_appbuilder" \
  "psycopg2-binary" \
  "asyncpg" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.1.0/constraints-3.12.txt"
```


GitHub link: https://github.com/apache/airflow/discussions/56377
