pysenic opened a new issue #18229:
URL: https://github.com/apache/airflow/issues/18229


   Version: 2.1.3
   OS: CentOS 7
   Executor: CeleryExecutor
   
   I upgraded Airflow from 1.10.15 to 2.1.3 (as a fresh redeployment in a new environment) and found the problem is still unresolved. My DAG default args are:
   
   ```
   default_args = {
       'owner': 'admin',
       'depends_on_past': True,
       'start_date': start_time,
       'wait_for_downstream': True,
       'on_failure_callback': failure_callback,
       'on_retry_callback': retry_callback,
       'retries': 1,
       'retry_delay': timedelta(seconds=60)
   }
   ```
   The DAG is scheduled hourly.
   
   Assume the task chain is A >> B >> C >> D >> E >> F.
   
   In the current run, A, B, and C have succeeded and D is running. In the next run, A and B have succeeded. Because wait_for_downstream=True is set, I expect the next run's task C to wait until the current run's D succeeds. Instead, task C reports a failure.
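
   For reference, here is a minimal sketch of the setup described above (the DummyOperator chain, the `start_time` value, and the callback bodies are placeholders I am assuming, not the original code):

   ```python
   from datetime import datetime, timedelta

   from airflow import DAG
   from airflow.operators.dummy import DummyOperator

   # Placeholder: the original report uses its own start_time and callbacks.
   start_time = datetime(2021, 9, 13)


   def failure_callback(context):
       """Placeholder for the reporter's failure alerting."""


   def retry_callback(context):
       """Placeholder for the reporter's retry alerting."""


   default_args = {
       'owner': 'admin',
       'depends_on_past': True,
       'start_date': start_time,
       'wait_for_downstream': True,
       'on_failure_callback': failure_callback,
       'on_retry_callback': retry_callback,
       'retries': 1,
       'retry_delay': timedelta(seconds=60),
   }

   with DAG(
       dag_id='dag_data_etl_1h',
       default_args=default_args,
       schedule_interval='@hourly',
   ) as dag:
       # Six tasks chained linearly, as described: A >> B >> C >> D >> E >> F
       a, b, c, d, e, f = [DummyOperator(task_id=t) for t in 'ABCDEF']
       a >> b >> c >> d >> e >> f
   ```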
   
   
   ```
   *** Reading local file: /home/hadoop/bi/log/airflow/dag_data_etl_1h/ods.session_logs/2021-09-13T06:03:00+00:00/2.log
   [2021-09-14 15:22:04,492] {__init__.py:51} DEBUG - Loading core task runner: StandardTaskRunner
   [2021-09-14 15:22:04,501] {base_task_runner.py:62} DEBUG - Planning to run as the  user
   [2021-09-14 15:22:04,503] {taskinstance.py:618} DEBUG - Refreshing TaskInstance <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]> from DB
   [2021-09-14 15:22:04,509] {taskinstance.py:656} DEBUG - Refreshed TaskInstance <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]>
   [2021-09-14 15:22:04,516] {taskinstance.py:918} DEBUG - <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]> dependency 'Task Instance State' PASSED: True, Task state queued was valid.
   [2021-09-14 15:22:04,516] {taskinstance.py:918} DEBUG - <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]> dependency 'Not In Retry Period' PASSED: True, The task instance was not marked for retrying.
   [2021-09-14 15:22:04,516] {taskinstance.py:918} DEBUG - <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]> dependency 'Task Instance Not Running' PASSED: True, Task is not in running state.
   [2021-09-14 15:22:04,524] {taskinstance.py:918} DEBUG - <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]> dependency 'Previous Dagrun State' PASSED: False, The tasks downstream of the previous task instance <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13 05:03:00+00:00 [success]> haven't completed (and wait_for_downstream is True).
   [2021-09-14 15:22:04,524] {taskinstance.py:897} INFO - Dependencies not met for <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13T06:03:00+00:00 [queued]>, dependency 'Previous Dagrun State' FAILED: The tasks downstream of the previous task instance <TaskInstance: dag_data_etl_1h.ods.session_logs 2021-09-13 05:03:00+00:00 [success]> haven't completed (and wait_for_downstream is True).
   [2021-09-14 15:22:04,525] {local_task_job.py:96} INFO - Task is not able to be run
   ```
   
   
   Nothing else looks abnormal, yet the task is inexplicably reported as failed.
   
   
![image](https://user-images.githubusercontent.com/20294807/133218204-bb31a20b-feee-454a-9cab-e66e20b8b18c.png)
   
   
   Earlier this year I had already upgraded to a 2.x version, and this problem was never resolved.
   
   _Originally posted by @pysenic in 
https://github.com/apache/airflow/discussions/18227_

