[
https://issues.apache.org/jira/browse/AIRFLOW-1630?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Ash Berlin-Taylor resolved AIRFLOW-1630.
----------------------------------------
Resolution: Fixed
Fix Version/s: 1.9.0
> Tasks do not end up in state UPSTREAM_FAILED consistently
> ---------------------------------------------------------
>
> Key: AIRFLOW-1630
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1630
> Project: Apache Airflow
> Issue Type: Bug
> Affects Versions: 1.8.2
> Environment: Ubuntu 16.04, PostgreSQL 9.6.5, Python 2.7.12
> Reporter: Nikolay Petrachkov
> Priority: Major
> Fix For: 1.9.0
>
> Attachments: Screen Shot 2017-09-21 at 17.36.37.png
>
>
> Given a simple DAG with two tasks, a BashOperator and a DummyOperator: the
> BashOperator runs the command "exit 1" and is upstream of the DummyOperator.
> When we run this DAG, we expect the BashOperator to fail and the DummyOperator
> to end up in state UPSTREAM_FAILED.
> Actual result: the BashOperator is in state FAILED, but the DummyOperator
> remains in state None.
> Code:
> {code:python}
> from datetime import datetime
>
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from airflow.operators.dummy_operator import DummyOperator
>
> default_args = {
>     'owner': 'airflow',
>     'start_date': datetime(2017, 9, 20),
>     'retries': 0
> }
>
> dag = DAG(
>     'delivery-failed',
>     default_args=default_args,
>     schedule_interval=None
> )
>
> failed_bash = "exit 1"
>
> bash_task = BashOperator(
>     task_id='bash-task',
>     bash_command=failed_bash,
>     dag=dag
> )
>
> end_task = DummyOperator(
>     task_id='end',
>     dag=dag
> )
>
> bash_task >> end_task
> {code}
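> For context, the expected propagation can be sketched in plain Python. This is
> an illustrative model only, not Airflow's actual scheduler code: with the
> default all_success trigger rule, any task whose upstream is FAILED (or
> UPSTREAM_FAILED) should itself be marked UPSTREAM_FAILED rather than left in
> state None.

```python
# Illustrative sketch of the expected state propagation (hypothetical helper,
# not part of Airflow): mark downstream tasks UPSTREAM_FAILED when any
# upstream task has failed, instead of leaving them in state None.
FAILED = "failed"
UPSTREAM_FAILED = "upstream_failed"

def propagate_states(upstreams, results):
    """upstreams: task -> list of upstream task ids, topologically ordered.
    results: task -> terminal state, or None if the task never ran."""
    states = dict(results)
    for task, deps in upstreams.items():
        if states.get(task) is not None:
            continue  # task already has a state; nothing to propagate
        if any(states.get(d) in (FAILED, UPSTREAM_FAILED) for d in deps):
            states[task] = UPSTREAM_FAILED
    return states

# The DAG from the report: bash-task >> end, with bash-task failing.
upstreams = {"bash-task": [], "end": ["bash-task"]}
states = propagate_states(upstreams, {"bash-task": FAILED, "end": None})
# 'end' should come out as upstream_failed, not None
```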
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)