bin-lian commented on code in PR #46817:
URL: https://github.com/apache/airflow/pull/46817#discussion_r1957901859
##########
providers/apache/spark/src/airflow/providers/apache/spark/hooks/spark_submit.py:
##########
@@ -557,7 +557,7 @@ def submit(self, application: str = "", **kwargs: Any) -> None:
# Check spark-submit return code. In Kubernetes mode, also check the value
# of exit code in the log, as it may differ.
- if returncode or (self._is_kubernetes and self._spark_exit_code != 0):
+ if returncode:
Review Comment:
For SparkSubmitOperator on Kubernetes, regardless of cluster or client mode, you
only need to monitor the final status of the spark-submit subprocess: the final
status of the subprocess is the final status of the Spark application.
1. Client mode: the driver program runs inside the subprocess on the submitting
machine, so the state of the subprocess is the state of the Spark application.
2. Cluster mode: the driver program runs in another pod, but spark-submit is a
child process whose exit status still reflects the final status of the Spark
application.

The running status of Spark can therefore be determined from the child process
return code alone. Parsing the exit code out of the driver log in Spark on
Kubernetes can cause the Spark task status to be reported incorrectly. Spark
also allows users to customize the exit code; the official documentation
describes that feature as being used in YARN cluster mode.
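The argument above can be sketched as follows. This is a hedged illustration,
not the Airflow hook's actual implementation: the spark-submit launch is
simulated with a short Python subprocess, and `run_and_get_status` is a
hypothetical helper, but it shows why the subprocess return code alone is
enough to decide success or failure.

```python
import subprocess
import sys


def run_and_get_status(argv: list[str]) -> int:
    """Run a command (stand-in for spark-submit) and return its exit code.

    Per the review, this return code is the final status of the Spark
    application in both client and cluster mode, so no log parsing of
    "Exit code:" lines is needed.
    """
    return subprocess.run(argv).returncode


# A zero return code means the application succeeded; nonzero means failure.
ok = run_and_get_status([sys.executable, "-c", "raise SystemExit(0)"])
failed = run_and_get_status([sys.executable, "-c", "raise SystemExit(3)"])
```

With this model, the simplified condition in the diff (`if returncode:`) is
sufficient, and the extra Kubernetes-specific log check becomes redundant.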

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]