pierrejeambrun commented on code in PR #60818:
URL: https://github.com/apache/airflow/pull/60818#discussion_r2829116260


##########
airflow-core/docs/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst:
##########
@@ -68,9 +68,11 @@ Follow the steps below to enable custom logging config class:
     .. code-block:: python
 
       from copy import deepcopy
-      from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
 
-      LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
+      # Airflow 3.x and later uses structlog for logging configuration.
+      # The legacy DEFAULT_LOGGING_CONFIG dict is no longer respected.
+      # See the migration note below for details.
+      # Example: To customize logging, use structlog configuration as described in the Airflow documentation.

Review Comment:
   Doesn't seem related to the PR
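
   For background, a minimal generic structlog configuration looks like the sketch below (plain structlog API only; illustrative, not Airflow's documented configuration hook):

       import structlog

       # Generic structlog setup -- illustrative, not an Airflow-specific hook.
       structlog.configure(
           processors=[
               structlog.processors.add_log_level,           # attach "level" to each event
               structlog.processors.TimeStamper(fmt="iso"),  # ISO-8601 timestamps
               structlog.processors.JSONRenderer(),          # one JSON object per line
           ],
       )

       log = structlog.get_logger()
       log.info("task_started", dag_id="example_dag")  # "example_dag" is a placeholder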



##########
airflow-core/docs/core-concepts/backfill.rst:
##########
@@ -47,6 +47,24 @@ Run ordering
 
You can run your backfill in reverse, i.e. latest runs first. The CLI option is ``--run-backwards``.
 
+Running backfills on paused DAGs
+---------------------------------
+
+Airflow allows backfills to run on paused DAGs. This is useful when:
+
+* You need to reprocess historical data without activating the regular DAG schedule
+* The data source is not available for regular runs
+* You want to avoid unnecessary computation and logs from scheduled triggers
+
+When you create a backfill on a paused DAG, only the backfill runs will execute; the DAG remains paused and will not be triggered by its regular schedule.
+
+**Via UI**: When creating a backfill for a paused DAG, you can:
+
+- **Check "Unpause on trigger"**: This will unpause the DAG before running the backfill (and regular schedules will resume)
+- **Uncheck "Unpause on trigger"** (default): The backfill will run but the DAG stays paused

Review Comment:
   Did you manually test this? Are the backfill runs succeeding now?
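
   For reference, a minimal manual check could create the backfill through the REST API, as sketched below. The endpoint path, payload fields, and auth are assumptions to verify against your Airflow version's API reference:

       import requests

       # Hypothetical manual test: create a backfill for a paused DAG via the
       # REST API. Endpoint path, payload fields, and auth are assumptions.
       AIRFLOW_URL = "http://localhost:8080"  # assumed local deployment
       TOKEN = "..."  # obtain an API token per your auth setup

       resp = requests.post(
           f"{AIRFLOW_URL}/api/v2/backfills",
           headers={"Authorization": f"Bearer {TOKEN}"},
           json={
               "dag_id": "example_dag",            # hypothetical DAG id
               "from_date": "2024-01-01T00:00:00Z",
               "to_date": "2024-01-07T00:00:00Z",
               "run_backwards": True,              # matches the --run-backwards CLI flag
           },
           timeout=30,
       )
       resp.raise_for_status()
       print(resp.json())  # then confirm the backfill runs succeed while the DAG stays paused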



##########
airflow-core/docs/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst:
##########
@@ -117,25 +119,11 @@ Example of custom logging for the ``SQLExecuteQueryOperator`` and the ``HttpHook
 
       from copy import deepcopy
       from pydantic.utils import deep_update
-      from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
 
-      LOGGING_CONFIG = deep_update(
-          deepcopy(DEFAULT_LOGGING_CONFIG),
-          {
-              "loggers": {
-                  "airflow.task.operators.airflow.providers.common.sql.operators.sql.SQLExecuteQueryOperator": {
-                      "handlers": ["task"],
-                      "level": "DEBUG",
-                      "propagate": True,
-                  },
-                  "airflow.task.hooks.airflow.providers.http.hooks.http.HttpHook": {
-                      "handlers": ["task"],
-                      "level": "WARNING",
-                      "propagate": False,
-                  },
-              }
-          },
-      )
+      # Airflow 3.x and later uses structlog for logging configuration.
+      # The legacy DEFAULT_LOGGING_CONFIG dict is no longer respected.
+      # See the migration note below for details.
+      # Example: To customize logging, use structlog configuration as described in the Airflow documentation.

Review Comment:
   Same
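
   If the per-operator overrides removed in this hunk are still needed, a rough stdlib equivalent is sketched below (logger names taken from the removed example; whether Airflow 3's structlog-based setup honors stdlib logger levels is an assumption to verify):

       import logging

       # Per-logger level overrides via the stdlib, using the logger names
       # from the removed example.
       logging.getLogger(
           "airflow.task.operators.airflow.providers.common.sql.operators.sql.SQLExecuteQueryOperator"
       ).setLevel(logging.DEBUG)

       logging.getLogger(
           "airflow.task.hooks.airflow.providers.http.hooks.http.HttpHook"
       ).setLevel(logging.WARNING)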


