ashb commented on issue #59634:
URL: https://github.com/apache/airflow/issues/59634#issuecomment-3744885402

   > In Airflow 3.x, task execution runs in an isolated context via the Task SDK, and only task-scoped loggers are guaranteed to be captured in task logs.
   
   That is not the intent at all.
   
   _All_ loggers that produce any output in a task should be captured and sent to the task log file.
   
   Given this DAG:
   
   ```python
   from airflow.sdk import DAG, task
   
   from datetime import datetime, timedelta
   
   
   default_args = {
       "depends_on_past": False,
       "email_on_failure": True,
       "email_on_retry": False,
       "retry_delay": timedelta(seconds=1),
   }
   with DAG(
       "Get_Conn",
       start_date=datetime(2025, 1, 1),
       catchup=False,
       schedule=None,
       default_args=default_args,
   ) as dag:
   
    # @task.virtualenv(requirements=["-e ./task-sdk", "pudb"], expect_airflow=True)
       @task
       def get_conn(event=None):
           from structlog import get_logger
           from logging import getLogger
           if event is not None:
               get_logger("hello").info("Returned from trigger")
               return
   
           get_logger("hello").info("Hi %d %s", 2, "%d")
           get_logger("hello").info("Hi %d")
           from airflow.sdk import Connection
   
           getLogger("hello").info("Hi %s", "%d")
   
           print(Connection.get("test_conn"))
       get_conn()
   ```
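   For reference, the expectation above matches how Python's stdlib `logging` propagation works: a handler attached to the root logger receives records from every named logger, so a task-log handler installed at the root would also capture the ad-hoc `hello` logger from the repro. A minimal sketch using only stdlib `logging` (this is an illustration of the propagation behaviour, not the Task SDK's actual wiring):

   ```python
   import io
   import logging

   # Capture destination standing in for the task log file.
   buffer = io.StringIO()
   handler = logging.StreamHandler(buffer)
   handler.setFormatter(logging.Formatter("%(name)s: %(message)s"))

   # Attach the handler at the root; any named logger propagates here
   # unless it explicitly sets propagate=False.
   root = logging.getLogger()
   root.addHandler(handler)
   root.setLevel(logging.INFO)

   # A logger name the "framework" has never seen still gets captured.
   # Lazy %-formatting: "Hi %s" % ("%d",) -> "Hi %d"
   logging.getLogger("hello").info("Hi %s", "%d")

   root.removeHandler(handler)
   print(buffer.getvalue().strip())  # hello: Hi %d
   ```

   `structlog` output would additionally depend on how its processors/factory are configured to route into stdlib logging, which is presumably part of what this issue is exercising.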
   
   I'll rerun this DAG as soon as I get my local dev env set up again, but I know that I was seeing the `hello` logger showing up too.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
