jscheffl commented on PR #63907:
URL: https://github.com/apache/airflow/pull/63907#issuecomment-4187640083

   > @jscheffl My reading of this PR and issue is that it goes beyond what you 
get from `default_args`. Retries set through `default_args` are task retries. 
They only re-run the task that failed. For example, with `task_a` >> `task_b`, 
if `task_b` fails, Airflow retries only `task_b`. `task_a` already succeeded 
and is not run again. My understanding of this change is that it adds a DAG run 
level retry so the run can be tried again in a way that can include upstream 
work like `task_a` again, not only the last failed task. So setting retries in 
default_args for all tasks does not fully replace that behavior. If I 
misunderstood the scheduler or model behavior, the author can correct me.
   
   Yes, okay, then it is not a "small addition or bugfix" but something that should first be aligned with the development community: whether such a use case should be supported and whether the additional complexity is acceptable. Scheduling is already complex, and adding more parameters, loops, and complexity is something that needs explicit agreement.
   
   Can you please send an email to the dev list as a [DISCUSS] thread on whether this feature should be accepted, and whether the chosen method of implementation is also the right way?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
