Hello folks,

I'm seeing some unexpected behaviour, which makes me think I'm missing
something about how Spark works. Let's say I have a Spark driver that
invokes a function like this:

----- in myDriver -----

val sparkContext = new SparkContext(mySparkConf)
val inputPath = "file:///home/myUser/project/resources/date=*/*"

val myResult = new MyResultFunction()(sparkContext, inputPath)

----- in MyResultFunction -----

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

class MyResultFunction extends Function2[SparkContext, String, RDD[String]]
    with Serializable {
  override def apply(sparkContext: SparkContext, inputPath: String): RDD[String] = {
    try {
      sparkContext.textFile(inputPath, 1)
    } catch {
      case t: Throwable =>
        myLogger.error(s"error: ${t.getStackTraceString}\n")
        // Fall back to an empty RDD if reading the input fails.
        sparkContext.makeRDD(Seq[String]())
    }
  }
}

What happens is that I'm *unable to catch exceptions* thrown by the
"textFile" method with the try..catch clause in MyResultFunction. In a
unit test where I call the function with an invalid "inputPath", I don't
get an empty RDD back; instead, the test exits (and fails) with an
unhandled exception.
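For concreteness, here is roughly what that unit test looks like (a minimal
sketch, assuming ScalaTest and a local-mode SparkContext; the suite name and
test path are illustrative):

----- in MyResultFunctionTest -----

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.funsuite.AnyFunSuite

class MyResultFunctionTest extends AnyFunSuite {
  test("returns an empty RDD for an invalid input path") {
    val sparkContext = new SparkContext(
      new SparkConf().setMaster("local[1]").setAppName("MyResultFunctionTest"))
    try {
      val result = new MyResultFunction()(sparkContext, "file:///no/such/path/*")
      // textFile is lazy: the exception for the missing path surfaces here,
      // at the first action, not inside the try..catch in MyResultFunction.
      assert(result.collect().isEmpty)
    } finally {
      sparkContext.stop()
    }
  }
}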

What am I missing here?

Thank you.

Best regards,
Roberto
