I would like to use FunSuite <http://doc.scalatest.org/2.2.1/index.html#org.scalatest.FunSuite> to test my Spark jobs by extending FunSuite with a new function, called |localTest|, that runs a test with a default SparkContext:

|class SparkFunSuite extends FunSuite {

  def localTest(name: String)(f: SparkContext => Unit): Unit = {
    val conf = new SparkConf().setAppName(name).setMaster("local")
    val sc = new SparkContext(conf)
    try {
      this.test(name)(f(sc))
    } finally {
      sc.stop()
    }
  }
}|

Then I can easily add tests to my test suites:

|class MyTestSuite extends SparkFunSuite {

  localTest("My Spark test") { sc =>
    assertResult(2)(sc.parallelize(Seq(1, 2, 3)).filter(_ <= 2).map(_ + 1).count)
  }
}|

The problem is that when I run the tests I get a |NullPointerException|:

|[info] MyTestSuite:
[info] - My Spark test *** FAILED ***
[info]   java.lang.NullPointerException:
[info]   at org.apache.spark.SparkContext.defaultParallelism(SparkContext.scala:1215)
[info]   at org.apache.spark.SparkContext.parallelize$default$2(SparkContext.scala:435)
[info]   at MyTestSuite$$anonfun$1.apply(FunSuiteTest.scala:24)
[info]   at MyTestSuite$$anonfun$1.apply(FunSuiteTest.scala:23)
[info]   at SparkFunSuite$$anonfun$localTest$1.apply$mcV$sp(FunSuiteTest.scala:13)
[info]   at SparkFunSuite$$anonfun$localTest$1.apply(FunSuiteTest.scala:13)
[info]   at SparkFunSuite$$anonfun$localTest$1.apply(FunSuiteTest.scala:13)
[info]   at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   ...|

What is causing the |NullPointerException|? More importantly, why do I get this error while Spark's own testing suite, which does something similar, doesn't? Can somebody explain the difference between them to me?
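For comparison, here is roughly how I understand Spark's own tests manage the context, using ScalaTest's |BeforeAndAfterEach|. This is a simplified sketch based on my reading of its |LocalSparkContext| test helper, not the actual source; the names and details are my own reconstruction:

|import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterEach, Suite}

// Sketch (not the real Spark source): give each test a fresh local
// SparkContext before it runs and stop it once the test has finished.
trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>

  @transient var sc: SparkContext = _

  override def beforeEach(): Unit = {
    super.beforeEach()
    sc = new SparkContext(new SparkConf().setAppName(suiteName).setMaster("local"))
  }

  override def afterEach(): Unit = {
    try {
      if (sc != null) sc.stop()
      sc = null
    } finally {
      super.afterEach()
    }
  }
}|

A suite would then mix in this trait and use |sc| directly inside plain |test| bodies, which looks very close to what my |localTest| helper tries to do.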

I'm using Scala 2.10.4 with |spark-core| 1.0.2 and |scalatest| 2.2.2.
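For completeness, my build definition pulls these in with roughly the following (standard sbt coordinates; my actual file may differ slightly):

|scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.0.2",
  "org.scalatest"    %% "scalatest"  % "2.2.2" % "test"
)|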
