I may have found my problem. We have a Scala wrapper on top of spark-submit
that runs the shell command through Scala.
That wrapper was effectively swallowing the exit code from spark-submit.
When I stripped the wrapper away and looked at the actual exit code, I
got 1.
So I think spark-submit is returning the right exit code after all.
Hi,
➜ spark git:(master) ✗ ./bin/spark-submit whatever || echo $?
Error: Cannot load main class from JAR file:/Users/jacek/dev/oss/spark/whatever
Run with --help for usage help or --verbose for debug output
1
I see exit code 1, and there are other cases that return 1 too.
Pozdrawiam,
Jacek Laskowski
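For anyone puzzled by the snippet above: in `cmd || echo $?`, the `echo` runs only when `cmd` fails, and at that point `$?` holds the failed command's exit status. A quick sketch:

```shell
# `||` runs its right-hand side only when the left-hand command fails;
# at that point $? holds the failed command's exit status.
sh -c 'exit 3' || echo $?   # prints 3
true || echo $?             # prints nothing, because `true` succeeded
```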
Hello,
+1, I have exactly the same issue. I need the exit code to decide which
actions Oozie executes next. spark-submit always returns 0 when it catches
the exception. From Spark 1.5 to 1.6.x I still have the same issue... It
would be great to fix it, or to know whether there is some workaround.
Hi,
An interesting case. You don't use Spark resources whatsoever.
Creating a SparkConf does not use YARN...yet. I think any run mode
would have the same effect. So, although spark-submit could have
returned exit code 1, the use case touches Spark very little.
What version is that? Do you see
Hi All,
I wrote a test script that always throws an exception, as below:
import org.apache.spark.SparkConf // needed for the SparkConf below

object Test {
  def main(args: Array[String]) {
    try {
      val conf =
        new SparkConf()
          .setAppName("Test")
      throw new RuntimeException("Some Exception")
      println("all done!") // unreachable: the throw above always fires
    }