[ https://issues.apache.org/jira/browse/SPARK-26365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17094932#comment-17094932 ]
Lorenzo Pisani commented on SPARK-26365:
----------------------------------------

I'm also seeing this behavior, specifically with the "cluster" deploy mode. The driver pod fails properly, but the pod that executed spark-submit exits with a status code of 0. This makes it very difficult to monitor the job and detect failures.

> spark-submit for k8s cluster doesn't propagate exit code
> --------------------------------------------------------
>
>                 Key: SPARK-26365
>                 URL: https://issues.apache.org/jira/browse/SPARK-26365
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Spark Submit
>    Affects Versions: 2.3.2, 2.4.0
>            Reporter: Oscar Bonilla
>            Priority: Minor
>
> When launching apps using spark-submit in a Kubernetes cluster, if the Spark
> application fails (returns exit code 1, for example), spark-submit will
> still exit gracefully and return exit code 0.
> This is problematic, since there is no way to know whether there has been a
> problem with the Spark application.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
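Until spark-submit itself propagates the failure, one common workaround is to inspect the driver pod's terminal phase after submission and exit with a non-zero code on failure. The sketch below is hypothetical, not from the ticket: `phase_to_exit_code` is an illustrative helper name, and the `kubectl` usage assumes the driver pod name is known and `kubectl` has access to the cluster. "Succeeded" and "Failed" are the standard terminal pod phases reported by Kubernetes.

```shell
#!/bin/sh
# Map a Kubernetes pod phase to a shell exit code so callers of a
# spark-submit wrapper can detect driver failures.
# (phase_to_exit_code is a hypothetical helper for illustration.)
phase_to_exit_code() {
  case "$1" in
    Succeeded) echo 0 ;;  # driver container exited 0
    Failed)    echo 1 ;;  # driver container exited non-zero
    *)         echo 2 ;;  # pod still running or in an unknown state
  esac
}

# Example usage (assumes $DRIVER_POD holds the driver pod name):
#   spark-submit --deploy-mode cluster ...
#   phase=$(kubectl get pod "$DRIVER_POD" -o jsonpath='{.status.phase}')
#   exit "$(phase_to_exit_code "$phase")"
```

Wrapping spark-submit this way gives CI systems and schedulers a meaningful exit code even though spark-submit itself returns 0 in cluster mode.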