taharah commented on issue #34783:
URL: https://github.com/apache/airflow/issues/34783#issuecomment-1750713541

   @Taragolis thank you for looking into this so quickly! Shortly after opening 
this issue, I identified the actual root cause of our logs no longer being 
written to S3: we had added a logger for `airflow` with `propagate` set to 
`False`, so that those logs would be written only to a file and not to the 
console. The config was:
   
   ```python
   LOGGING_CONFIG["loggers"]["airflow"] = {
       "handlers": ["file"],
       "level": LOG_LEVEL,
       "propagate": False,
   }
   ```
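   For illustration, here is a minimal stdlib `logging` sketch (using a hypothetical `app`/`app.task` hierarchy in place of `airflow`/`airflow.task`) of why `propagate: False` on the parent logger keeps records from ever reaching the root logger's handlers:
   
   ```python
   import logging
   
   captured = []
   
   class ListHandler(logging.Handler):
       """Collects messages so we can observe what the root logger's handler sees."""
       def emit(self, record):
           captured.append(record.getMessage())
   
   root = logging.getLogger()
   root.setLevel(logging.INFO)
   root.addHandler(ListHandler())
   
   # Mirror the config above: the parent logger stops propagation.
   app = logging.getLogger("app")
   app.setLevel(logging.INFO)
   app.propagate = False
   
   logging.getLogger("app.task").info("task log line")
   
   # The record stops at "app"; the root handler is never invoked.
   print(captured)  # []
   ```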
   
   After removing the aforementioned logger, our task logs began showing up as 
expected. However, we only hit this issue because of the changes made in 
#28440: that PR introduced a hard requirement that `airflow.task` log records 
propagate to the root logger in order for the handlers associated with 
`airflow.task` to be invoked.
   
   Thus, a couple of things still need to be done to fully resolve this issue.
   
   1. Update the task logging documentation to note that `airflow.task` log 
records must be able to propagate to the root logger.
   2. While investigating this issue, I noticed a small bug in the new 
implementation that causes its behavior to differ from the original: the root 
logger's configuration (its handlers and log level) is not reverted when the 
context manager exits.
   
   I'll work on putting together PRs to make the necessary changes.

