JonnyWaffles opened a new issue, #24948:
URL: https://github.com/apache/airflow/issues/24948

   ### Apache Airflow version
   
   2.2.4
   
   ### What happened
   
   Hi team, when executing `airflow dags test <dag_id> <logical_date>` and the `DagRun` enters the failed state, Airflow prints the exception and exits gracefully with code 0. I would expect a test failure to return a non-zero status code so that callers are informed the test failed. I searched the issues but did not see any discussion of this.
   
   I noticed this behavior in the following scenario. I need to trigger my deployed Airflow cluster's integration test kit from a remote CI/CD controller, so I wrote a simple "stand-alone" `Dag` for the CI/CD server to execute locally as a script, and ran it with `dags test` to bypass the scheduler and run it as a local process.
   
   In pseudocode:
   ```python
   from airflow.decorators import task
   from airflow.operators.trigger_dagrun import TriggerDagRunOperator
   from airflow.sensors.external_task import ExternalTaskSensor
   from airflow.utils.state import State

   trigger_test_kit_op = TriggerDagRunOperator(
       task_id="trigger_test_kit_op",
       trigger_dag_id="platform.integration_test",
       execution_date="{{ ts }}",
       reset_dag_run=True,
   )
   wait_for_completion = ExternalTaskSensor(
       task_id="wait_for_completion",
       external_dag_id="platform.integration_test",
       poke_interval=10,
       timeout=60 * 15,  # We may need to adjust this as we add tests
       allowed_states=(State.SUCCESS, State.FAILED),
   )

   @task()
   def publish_result():
       # Look up the triggered run's final state; raise an exception
       # (exiting non-zero) if the test kit failed.
       ...
   ```
   My test kit failed, but the CI/CD server never knew, because the CLI return code is always 0. This is probably because the DebugExecutor loop catches all exceptions and simply marks the DagRun as failed:
   https://github.com/apache/airflow/blob/a95c5e214b34a0a4f0617d17e1c5515fb1bb24b2/airflow/executors/debug_executor.py#L73-L85
   
   Should the test command exit with a non-zero status, or should it always return 0 as the current implementation does? If the latter, what is a good alternative way to run my Dag as a script and signal failure?
   
   ### What you think should happen instead
   
   I'm not sure whether this is actually a bug. I'm mainly curious about the design and whether anyone has thoughts.
   
   ### How to reproduce
   
   Create a DAG that always fails and run it with `dags test`:
   ```python
   from datetime import datetime

   from airflow import DAG
   from airflow.decorators import task

   with DAG(
       "failure",
       schedule_interval=None,
       start_date=datetime(2021, 1, 1),
       catchup=False,
       is_paused_upon_creation=False,
   ) as dag:

       @task()
       def failure():
           raise Exception("I die")

       failure()
   ```
   ```shell
   airflow dags test failure $(date -I)
   ```
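
   The task raises and the run is marked failed, yet the command reports success:
   ```shell
   echo $?
   # 0
   ```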
   
   ### Operating System
   
   Ubuntu 20.04.4 LTS
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

