I'm trying to write a unit test to ensure that some functions I rely on
will always serialize and run correctly on a cluster.
In one of these functions I've deliberately added a "val x: Int = 1" which
should prevent this method from being serializable, right?

In the test I've done:
   sc = new SparkContext("local[2]","test")
   ...
   val pdata = sc.parallelize(data)
   val c = pdata.map(f).collect()   // f: the function under test

The unit tests still complete with no errors; I'm guessing that's because
Spark knows local[2] doesn't require serialization? Is there some way I can
force Spark to run as it would on a real cluster?
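One workaround I've seen suggested is to test serializability directly,
without a cluster, by round-tripping the closure through Java serialization
yourself. This approximates the check Spark performs before shipping a task
to executors. The helper object and names below are just a sketch, not a
Spark API:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical helper: returns true iff the given closure survives
// Java serialization, i.e. roughly what Spark needs to ship a task.
object SerializabilityCheck {
  def isSerializable(f: AnyRef): Boolean =
    try {
      val out = new ObjectOutputStream(new ByteArrayOutputStream())
      out.writeObject(f)
      out.close()
      true
    } catch {
      case _: NotSerializableException => false
    }
}

// A closure capturing nothing problematic serializes fine...
val ok = SerializabilityCheck.isSerializable((i: Int) => i + 1)

// ...but one capturing an instance of a non-Serializable class does not.
class NotSer { val x: Int = 1 }
val ns = new NotSer
val bad = SerializabilityCheck.isSerializable((i: Int) => i + ns.x)
```

An assertion on `isSerializable` in the unit test should then fail for the
broken function even when the SparkContext runs in local[2] mode.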


tks
shay
