erdos2n opened a new issue, #30264:
URL: https://github.com/apache/airflow/issues/30264

   ### Apache Airflow version
   
   2.5.2
   
   ### What happened
   
   Users are experiencing the following:
   
   * A DAG begins to run
   * Task(s) go into running state, as expected
   * The DagRun times out, marking any currently running task as SKIPPED
   * Because the tasks are not marked as failed, the `on_failure_callback` never gets invoked
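
   The mechanics can be sketched with a minimal model of callback dispatch (hypothetical names, not Airflow's actual implementation): the failure callback fires only when a task reaches a FAILED terminal state, so an external move to SKIPPED bypasses it entirely.

   ```python
   # Minimal sketch (hypothetical model, not Airflow's real code) of why
   # the callback never fires: on_failure_callback is keyed off the
   # "failed" terminal state, so a task externally set to "skipped"
   # bypasses it.

   FIRED = []

   def on_failure_callback(context):
       FIRED.append(context["task_id"])

   def finish_task(task_id, terminal_state):
       """Dispatch callbacks based on the terminal state the task reached."""
       if terminal_state == "failed":
           on_failure_callback({"task_id": task_id})
       # "skipped" is not a failure state, so no failure callback runs.

   # A DagRun timeout currently sets running tasks to "skipped"...
   finish_task("task_1", "skipped")
   assert FIRED == []  # callback never invoked

   # ...whereas setting them to "failed" would invoke it.
   finish_task("task_1", "failed")
   assert FIRED == ["task_1"]
   ```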
   
   Here are some example logs:
   
   ```
   [2023-03-22, 16:30:02 PDT] {local_task_job.py:266} WARNING - DagRun timed 
out after 4:00:02.394287.
   [2023-03-22, 16:30:07 PDT] {local_task_job.py:266} WARNING - DagRun timed 
out after 4:00:07.447373.
   [2023-03-22, 16:30:07 PDT] {local_task_job.py:272} WARNING - State of this 
instance has been externally set to skipped. Terminating instance.
   [2023-03-22, 16:30:07 PDT] {process_utils.py:129} INFO - Sending 
Signals.SIGTERM to group 8515. PIDs of all processes in the group: [8515]
   ```
   
   
   ### What you think should happen instead
   
   Once a DagRun times out, tasks that are currently in the RUNNING state should be marked as FAILED, and their downstream tasks should be marked as UPSTREAM_FAILED.
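
   The expected handling could be sketched like this (a plain-Python illustration of the desired state transitions, not Airflow internals):

   ```python
   # Illustrative sketch (not Airflow code) of the proposed timeout
   # handling: running tasks become "failed", and their downstreams
   # become "upstream_failed".

   def apply_dagrun_timeout(task_states, downstream_of):
       """Return task states after the proposed timeout handling.

       task_states: dict of task_id -> state
       downstream_of: dict of task_id -> list of downstream task_ids
       """
       new_states = dict(task_states)
       for task_id, state in task_states.items():
           if state == "running":
               new_states[task_id] = "failed"
               for child in downstream_of.get(task_id, []):
                   if new_states.get(child) in (None, "none"):
                       new_states[child] = "upstream_failed"
       return new_states

   states = {"task_1": "running", "task_2": "running",
             "downstream_finished_task": "none"}
   deps = {"task_1": ["downstream_finished_task"],
           "task_2": ["downstream_finished_task"]}
   result = apply_dagrun_timeout(states, deps)
   assert result == {
       "task_1": "failed",
       "task_2": "failed",
       "downstream_finished_task": "upstream_failed",
   }
   ```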
   
   ### How to reproduce
   
   The following DAG reproduces the issue intermittently:
   
   ```python
   import time
   import logging
   
   from airflow.decorators import dag, task
    from datetime import datetime, timedelta
   
   
   
   @task
   def task_1():
       import random
       pulses = random.randint(5, 10)
       for i in range(pulses):
           logging.info(f"pulsing: pulse...{i}")
           time.sleep(4)
   
   
   @task
   def task_2():
       import random
       pulses = random.randint(10, 20)
       for i in range(pulses):
           logging.info(f"pulsing: pulse...{i}")
           time.sleep(5)
   
   @task
   def downstream_finished_task():
       logging.info("task finished")
       time.sleep(20)
   
   @dag(dag_id="dagrun_interval_test",
        schedule_interval="*/5 * * * *",
        start_date=datetime(2023, 3, 23),
        dagrun_timeout=timedelta(seconds=30),
        catchup=False)
   def my_dag():
       return [task_1(), task_2()] >> downstream_finished_task()
   
   
   dag = my_dag()
   ```
   
   
   * Running tasks are marked as SKIPPED
   * The downstream task is left with `no status`
   
   See screenshot
   ![Screen Shot 2023-03-23 at 4 23 34 
PM](https://user-images.githubusercontent.com/26331746/227366436-6fac0b18-6bc3-432d-8c73-3f0617045220.png)
   
   ### Operating System
   
   MacOS
   
   ### Versions of Apache Airflow Providers
   
   N/A
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   Airflow Version 2.5.2
   
   ### Anything else
   
   This happens every time a DagRun times out while tasks are still running.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

