WattsInABox commented on issue #18011:
URL: https://github.com/apache/airflow/issues/18011#issuecomment-921371431


   We're also seeing this. Most of our DAGs have well under 100 tasks, and a 
few have just under 200; in total we have 673 active DAGs and 179 paused DAGs. 
We do not use `wait_for_downstream` anywhere.
   
   We started seeing this after upgrading to 2.1.3, which we upgraded to 
specifically for the bug fix in PR #16301. We're not sure whether that bug is 
related, since we seem to be having odd status issues all over Airflow...
   
   We see this in all manner of DAGs: some with a very linear path, some that 
branch into 100 tasks and then back to 1, and others with 2 prerequisite tasks 
feeding into the final task.
   
   Behavior:
   - upstream tasks all successful
   - downstream task(s) marked as upstream_failed
   - **sometimes** an upstream task will have a previous run marked as failed 
but then retry successfully, almost as if the downstream tasks get marked as 
upstream_failed on that attempt and then don't get cleared for the subsequent 
retry. **But this does not always happen**: we have seen multiple dagruns a 
night with upstream_failed tasks where all prior tasks succeeded on their first 
attempt (or at least only have logs for 1 attempt).
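   
   For reference, here is a simplified model (not actual Airflow code; the 
function and state names are just illustrative) of what we would expect from 
the default `all_success` trigger rule. In all the cases above, every upstream 
task ends in success, so the downstream task should become runnable rather 
than upstream_failed:
   
   ```python
   # Simplified, illustrative model of the default "all_success" trigger
   # rule: a downstream task should only be marked upstream_failed when at
   # least one upstream task actually failed (or was itself upstream_failed).
   def expected_downstream_state(upstream_states):
       """Return the state a downstream task should take once all of its
       upstream tasks have reached a terminal state."""
       if all(s == "success" for s in upstream_states):
           return "scheduled"  # eligible to run
       if any(s in ("failed", "upstream_failed") for s in upstream_states):
           return "upstream_failed"
       return "none"  # still waiting on upstream

   # The fan-out/fan-in shape described above: 1 -> 100 tasks -> 1.
   fan_in = ["success"] * 100
   print(expected_downstream_state(fan_in))                 # -> "scheduled"
   print(expected_downstream_state(["success", "failed"]))  # -> "upstream_failed"
   ```
   
   What we observe instead is the first case (all upstream success) sometimes 
producing upstream_failed.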
   
   Please advise on what other information we can provide.
