Nikolay Petrachkov created AIRFLOW-1630:
-------------------------------------------

             Summary: Tasks do not end up in state UPSTREAM_FAILED consistently
                 Key: AIRFLOW-1630
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1630
             Project: Apache Airflow
          Issue Type: Bug
    Affects Versions: 1.8.2
         Environment: Ubuntu 16.04
            Reporter: Nikolay Petrachkov
         Attachments: Screen Shot 2017-09-21 at 17.36.37.png

Given a simple DAG with two tasks, a BashOperator and a DummyOperator:
the BashOperator runs the command "exit 1" and is upstream of the DummyOperator.

When we run this DAG, we expect the BashOperator to fail and the DummyOperator to end up in 
state UPSTREAM_FAILED (with the default trigger_rule 'all_success', a downstream task should be 
marked UPSTREAM_FAILED once an upstream task fails).

Actual result: the BashOperator is in state FAILED, but the DummyOperator stays in 
state None.

Code:

{code:python}
from airflow import DAG
from datetime import datetime
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2017, 9, 20),
    'retries': 0
}
dag = DAG(
    'delivery-failed',
    default_args=default_args,
    schedule_interval=None
)

# Command that always fails; the downstream task should therefore be
# marked UPSTREAM_FAILED.
failed_bash = "exit 1"

bash_task = BashOperator(
    task_id='bash-task',
    bash_command=failed_bash,
    dag=dag
)

end_task = DummyOperator(
    task_id='end',
    dag=dag
)

bash_task >> end_task
{code}
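
For what it's worth, a quick way to confirm the recorded states after triggering the run (e.g. with {{airflow trigger_dag delivery-failed}}) is to read the task instances back from the metadata database. This is only a minimal sketch and assumes the default SQLAlchemy session exposed by {{airflow.settings}}:

{code:python}
# Minimal sketch (not part of the DAG above): read the task instance states
# recorded for the 'delivery-failed' DAG from the metadata database.
from airflow import settings
from airflow.models import TaskInstance

session = settings.Session()
task_instances = (
    session.query(TaskInstance)
    .filter(TaskInstance.dag_id == 'delivery-failed')
    .order_by(TaskInstance.execution_date)
    .all()
)
for ti in task_instances:
    # Expected: bash-task -> failed, end -> upstream_failed
    # Observed: bash-task -> failed, end -> None
    print("%s %s %s" % (ti.task_id, ti.execution_date, ti.state))
session.close()
{code}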




