[ https://issues.apache.org/jira/browse/SPARK-26365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17601827#comment-17601827 ]
Shrikant Prasad commented on SPARK-26365:
-----------------------------------------

A spark-submit exit code ({{$?}}) of 0 is expected here, since there was no error in the job submission itself. It is the job that failed, and that information is available in the driver container's exit code (1). When the job submission fails, we do get a proper non-zero exit code from spark-submit. So this does not seem to be a bug.

{code:java}
container status:
    container name: spark-kubernetes-driver
    container image: ******
    container state: terminated
    container started at: 2022-09-08T13:40:39Z
    container finished at: 2022-09-08T13:40:43Z
    exit code: 1
    termination reason: Error
{code}

> spark-submit for k8s cluster doesn't propagate exit code
> --------------------------------------------------------
>
>                 Key: SPARK-26365
>                 URL: https://issues.apache.org/jira/browse/SPARK-26365
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Spark Core, Spark Submit
>    Affects Versions: 2.3.2, 2.4.0, 3.0.0, 3.1.0
>            Reporter: Oscar Bonilla
>            Priority: Major
>         Attachments: spark-2.4.5-raise-exception-k8s-failure.patch, spark-3.0.0-raise-exception-k8s-failure.patch
>
> When launching apps using spark-submit in a Kubernetes cluster, if the Spark application fails (returns exit code 1, for example), spark-submit will still exit gracefully and return exit code 0.
> This is problematic, since there is no way to know whether there has been a problem with the Spark application.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
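Until spark-submit propagates the driver's status itself, a wrapper script can read the exit code from the terminated driver container, as the comment above does. A minimal shell sketch, assuming Spark's default `spark-role=driver` pod label and a `kubectl` context pointing at the target cluster (the function names here are illustrative, not part of any Spark tooling):

```shell
# Sketch: read the terminated driver container's exit code from the pod status.
# ASSUMPTION: the driver pod carries Spark's default "spark-role=driver" label;
# narrow the selector (e.g. by app name) if several drivers share the namespace.
get_driver_exit_code() {
  kubectl get pod -l spark-role=driver \
    -o jsonpath='{.items[0].status.containerStatuses[0].state.terminated.exitCode}'
}

# Return the driver's exit code so the calling script can fail accordingly.
propagate_driver_exit_code() {
  code=$(get_driver_exit_code)
  # Treat an empty result (pod missing or not yet terminated) as unknown, not success.
  if [ -z "$code" ]; then
    echo "could not determine driver exit code" >&2
    return 2
  fi
  if [ "$code" -ne 0 ]; then
    echo "Spark driver failed with exit code $code" >&2
    return "$code"
  fi
  return 0
}
```

In a wrapper one would run spark-submit first, then call `propagate_driver_exit_code` and `exit` with its return status, so CI systems see the job failure rather than spark-submit's 0.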