YangMuye opened a new issue #14265:
URL: https://github.com/apache/airflow/issues/14265


   **Apache Airflow version**: 2.0.0
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**:
   - **OS** (e.g. from /etc/os-release): Red Hat 7.9
   - **Kernel** (e.g. `uname -a`): 3.10.0-1160.11.1.el7.x86_64
   - **Install tools**: pip
   - **Others**:
   
   **What happened**:
   
   I cleared a successful DAG run and it failed to be scheduled again, with the following message in the scheduler log:
   
   `{scheduler_job.py:1639} INFO - Run scheduled__2021-02-02T16:00:00+00:00 of some_job has timed-out`
   
   After removing `dagrun_timeout`, the same DAG run can be rescheduled.
   
   **What you expected to happen**:
   
   I expected the `dagrun_timeout` to be counted from the moment the DAG run is cleared, not from the original start of the run.
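   
   If I read the scheduler log correctly, the timeout seems to be checked against the DagRun's original `start_date`, which clearing does not reset. A minimal sketch of that presumed comparison (my assumption, not the actual scheduler code):
   
   ```python
   import pendulum


   def dag_run_timed_out(run_start_date: pendulum.DateTime,
                         dagrun_timeout: pendulum.Duration) -> bool:
       # Assumption: the DagRun keeps its original start_date after a clear,
       # so an old run is immediately past its timeout once rescheduled.
       return pendulum.now("UTC") - run_start_date > dagrun_timeout
   ```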
   
   **Anything else we need to know**:
   
   I can reproduce it with the following code; I am not sure whether this is the intended behavior.
   
   ```python
   from airflow import DAG
   from airflow.operators.dummy import DummyOperator
   import pendulum

   with DAG("test_timeout",
       default_args={'owner': 'airflow'},
       start_date=pendulum.yesterday(),
       schedule_interval='@daily',
       # Add the following parameter after the first run has completed,
       # then clear the successful run:
       # dagrun_timeout=pendulum.duration(minutes=1),
   ) as dag:
       DummyOperator(task_id='dummy')
   ```
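   
   As a possible workaround (a sketch only; it assumes the timeout check uses the ORM `DagRun.start_date`, and the helper name below is my own), the run's `start_date` could be reset after clearing so the timeout window restarts:
   
   ```python
   import pendulum

   from airflow.models import DagRun
   from airflow.utils.session import create_session
   from airflow.utils.state import State


   def reset_dagrun_start(dag_id: str, execution_date: pendulum.DateTime) -> None:
       """Hypothetical helper: restart the dagrun_timeout window for one run."""
       with create_session() as session:
           run = (
               session.query(DagRun)
               .filter(DagRun.dag_id == dag_id,
                       DagRun.execution_date == execution_date)
               .one_or_none()
           )
           if run is not None:
               run.start_date = pendulum.now("UTC")  # count the timeout from now
               run.state = State.RUNNING
               session.merge(run)
   ```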

