Juan M George created AIRFLOW-5540:
--------------------------------------

             Summary: task with SparkSubmitOperator does not fail if the
spark job it executes fails.
                 Key: AIRFLOW-5540
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5540
             Project: Apache Airflow
          Issue Type: Wish
          Components: operators
    Affects Versions: 1.10.3
         Environment: RHEL 7.3 with default Airflow installation.
            Reporter: Juan M George
         Attachments: airfow_issues

In my test DAG, I have a task that uses the SparkSubmitOperator to execute a
Spark job that reads from a database table, does some processing, and writes
the result to a file. In the scenario where the source table I read from does
not exist, the Spark job fails, but the task is still shown as having executed
successfully. I don't see any other way to handle this business-logic failure
from the operator side.
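A minimal sketch of a DAG reproducing the reported setup (all ids and paths here are hypothetical, not taken from the attachment; assumes Airflow 1.10.3, where SparkSubmitOperator lives under airflow.contrib):

```python
# Hypothetical reproduction sketch -- DAG id, task id, application path,
# and connection id are all illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator

dag = DAG(
    dag_id="spark_submit_test",     # hypothetical DAG id
    start_date=datetime(2019, 1, 1),
    schedule_interval=None,
)

# Submits a Spark job that reads a database table, processes it, and
# writes a file. Per the report, when the source table is missing the
# Spark job fails, yet this task is still marked as successful.
read_and_process = SparkSubmitOperator(
    task_id="read_and_process",      # hypothetical task id
    application="/path/to/job.py",   # hypothetical Spark application
    conn_id="spark_default",
    dag=dag,
)
```

This is a DAG-definition fragment rather than a standalone script; it needs a running Airflow installation to execute.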



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
