Weird, it seems like this is trying to use the SparkContext before it's 
initialized, or something like that. Have you tried unrolling this into a 
single method? I wonder if you just have multiple versions of these libraries 
on your classpath or something.
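
For example, something along these lines might behave differently (an untested sketch: it moves the SparkContext creation inside the body passed to test(), so the context is created and stopped when the test actually runs rather than when the suite is constructed):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.FunSuite

    class SparkFunSuite extends FunSuite {
      // test() only registers this block; ScalaTest runs it later, so the
      // context lives exactly as long as the test body.
      def localTest(name: String)(f: SparkContext => Unit): Unit = test(name) {
        val conf = new SparkConf().setAppName(name).setMaster("local")
        val sc = new SparkContext(conf)
        try f(sc) finally sc.stop()
      }
    }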

Matei

On Oct 4, 2014, at 1:40 PM, Mario Pastorelli <mario.pastore...@teralytics.ch> 
wrote:

> I would like to use FunSuite to test my Spark jobs by extending FunSuite with 
> a new function, called localTest, that runs a test with a default 
> SparkContext:
> 
> import org.apache.spark.{SparkConf, SparkContext}
> import org.scalatest.FunSuite
> 
> class SparkFunSuite extends FunSuite {
> 
>   def localTest(name: String)(f: SparkContext => Unit): Unit = {
>     val conf = new SparkConf().setAppName(name).setMaster("local")
>     val sc = new SparkContext(conf)
>     try {
>       this.test(name)(f(sc))
>     } finally {
>       sc.stop()
>     }
>   }
> }
> Then I can easily add tests to my test suites:
> 
> class MyTestSuite extends SparkFunSuite {
> 
>   localTest("My Spark test") { sc =>
>     assertResult(2)(sc.parallelize(Seq(1, 2, 3)).filter(_ <= 2).map(_ + 1).count)
>   }
> }
> The problem is that when I run the tests I get a NullPointerException:
> 
> [info] MyTestSuite:
> [info] - My Spark test *** FAILED ***
> [info]   java.lang.NullPointerException:
> [info]   at org.apache.spark.SparkContext.defaultParallelism(SparkContext.scala:1215)
> [info]   at org.apache.spark.SparkContext.parallelize$default$2(SparkContext.scala:435)
> [info]   at MyTestSuite$$anonfun$1.apply(FunSuiteTest.scala:24)
> [info]   at MyTestSuite$$anonfun$1.apply(FunSuiteTest.scala:23)
> [info]   at SparkFunSuite$$anonfun$localTest$1.apply$mcV$sp(FunSuiteTest.scala:13)
> [info]   at SparkFunSuite$$anonfun$localTest$1.apply(FunSuiteTest.scala:13)
> [info]   at SparkFunSuite$$anonfun$localTest$1.apply(FunSuiteTest.scala:13)
> [info]   at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
> [info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> [info]   ...
> What is causing the NullPointerException? More importantly, why do I get this 
> error while Spark's own test suite, which does something similar, doesn't? 
> Can somebody explain the difference between the two to me?
> I'm using Scala 2.10.4 with spark-core 1.0.2 and scalatest 2.2.2.
> 
