Hello,

I would like to know if there is a way of catching, from the driver program,
an exception thrown in executor code. Here is an example:

try {
  val x = sc.parallelize(Seq(1,2,3)).map(e => e / 0).collect
} catch {
  case e: SparkException => {
    println(s"ERROR: $e")
    println(s"CAUSE: ${e.getCause}")
  }
}

Output:
ERROR: org.apache.spark.SparkException: Job aborted due to stage failure:
Task 1 in stage 1.0 failed 4 times, most recent failure: Lost task 1.3 in
stage 1.0 (TID 15, pio1.c.ace-lotus-714.internal):
java.lang.ArithmeticException: / by zero
at
$line71.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply$mcII$sp(<console>:51)
...
CAUSE: null

The exception's cause is null. Is there any way I can catch the underlying
ArithmeticException in the driver?
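For context, the only workaround I know of is to wrap the per-element
computation in scala.util.Try on the executor side, so failures come back to
the driver as values instead of task failures. Below is a sketch in plain
Scala (the `safeDivide` helper is my own name; in Spark the same function
would be passed to `sc.parallelize(Seq(1,2,3)).map(safeDivide).collect`):

```scala
import scala.util.{Try, Success, Failure}

// Wrap the risky computation so the exception travels back as data
// rather than aborting the task.
def safeDivide(e: Int): Either[String, Int] =
  Try(e / 0) match {
    case Success(v)  => Right(v)
    case Failure(ex) => Left(s"${ex.getClass.getName}: ${ex.getMessage}")
  }

val results = Seq(1, 2, 3).map(safeDivide)

// Collect the failures on the driver side.
val errors = results.collect { case Left(msg) => msg }
```

But I would prefer to catch the original exception directly from the
SparkException, if that is possible.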

Thanks

Justin

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Catching-executor-exception-from-executor-in-driver-tp22495.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.