Haapalaj opened a new issue, #60969:
URL: https://github.com/apache/airflow/issues/60969

   ### Apache Airflow version
   
   3.1.6
   
   ### If "Other Airflow 3 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
    When logging or printing from code executed by a DAG-level on_failure_callback, nothing gets logged in either the task log or the scheduler logs.
   
   
   ### What you think should happen instead?
   
    Logs from the DAG on_failure_callback execution should be written at least to the scheduler logs.
   
   ### How to reproduce
   
    Run this example DAG, which triggers the on_failure_callback. The file is written, but no logs are seen.
   
    ```python
    from datetime import datetime, timedelta
    import logging

    from airflow.sdk import DAG
    from airflow.providers.standard.operators.python import PythonOperator

    logger = logging.getLogger(__name__)

    name = 'TEST_PYTHON'

    def on_fail(context):
        print(f'on_failure_callback called. Context: {context}')
        logger.info(f'logger: on_failure_callback called. Context: {context}')
        logger.warning(f'logger: on_failure_callback called. Context: {context}')
        logger.error(f'logger: on_failure_callback called. Context: {context}')
        with open("/tmp/on_fail_output.txt", "w") as f:
            f.write("failed")

    default_args = {
        'owner': 'airflow',
        'depends_on_past': False,
        'start_date': datetime(2015, 6, 1),
        'email_on_failure': False,
        'email_on_retry': False,
        'retries': 0,
        'retry_delay': timedelta(minutes=1),
    }

    dag = DAG(
        name,
        default_args=default_args,
        schedule='* * * * *',
        on_failure_callback=on_fail,
    )

    def run_in_task():
        raise Exception('The task failed')

    previous_task = None
    for x in range(1, 2):
        task = PythonOperator(
            task_id=name + '_task' + str(x),
            python_callable=run_in_task,
            dag=dag,
        )
        if previous_task:
            previous_task >> task
        previous_task = task
    ```
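    In case it helps triage, here is a minimal diagnostic sketch using only the stdlib `logging` module (no Airflow APIs; the function name and log path are made up for illustration). It attaches a `FileHandler` directly inside the callback, which distinguishes "the log records are never emitted" from "the records are emitted but dropped by the handlers Airflow configures":

    ```python
    import logging

    def on_fail_with_file_handler(context):
        # Hypothetical diagnostic callback: the logger name and file path
        # below are illustrative, not part of the original report.
        diag_logger = logging.getLogger("diagnostic_callback")
        diag_logger.setLevel(logging.INFO)
        handler = logging.FileHandler("/tmp/on_fail_callback.log")
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(message)s")
        )
        diag_logger.addHandler(handler)
        try:
            # If this line reaches the file, the callback runs and emits
            # records; the problem would then be handler/propagation setup.
            diag_logger.error(
                "on_failure_callback called. Context keys: %s", sorted(context)
            )
        finally:
            handler.close()
            diag_logger.removeHandler(handler)

    # Simulate Airflow invoking the callback with a context dict:
    on_fail_with_file_handler({"dag": None, "run_id": "manual__2024-01-01"})
    ```

    With the example DAG above, /tmp/on_fail_output.txt is written, so the callback clearly runs; a file handler like this would show whether the logging calls themselves also go through.
    
    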
   
   ### Operating System
   
   Debian GNU/Linux 12 (bookworm)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
    airflow 3.1.6 (image apache/airflow:3.1.6-python3.12)
   apache-airflow-providers-standard==1.10.1
   apache-airflow-providers-snowflake==6.8.0
   apache-airflow-providers-amazon==9.18.1
   apache-airflow-providers-google==19.2.0
   apache-airflow-providers-microsoft-azure==12.10.0
   apache-airflow-providers-odbc==4.11.0
   apache-airflow-providers-postgres==6.5.1
   apache-airflow-providers-databricks==7.8.1
   apache-airflow-task-sdk==1.1.6
   apache-airflow-providers-fab==3.1.2
   
   python3.12
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
    - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
