Re: Intermittent test failures

2014-12-17 Thread Marius Soutier
Using TestSQLContext from multiple tests leads to: SparkException: Task not serializable ERROR ContextCleaner: Error cleaning broadcast 10 java.lang.NullPointerException at org.apache.spark.broadcast.TorrentBroadcast$.unpersist(TorrentBroadcast.scala:246) at org.apache.spark.b

Re: Intermittent test failures

2014-12-15 Thread Marius Soutier
Ok, maybe these test versions will help me then. I’ll check it out. On 15.12.2014, at 22:33, Michael Armbrust wrote: > Using a single SparkContext should not cause this problem. In the SQL tests we use TestSQLContext and TestHive which are global singletons for all of our unit testing.

Re: Intermittent test failures

2014-12-15 Thread Michael Armbrust
Using a single SparkContext should not cause this problem. In the SQL tests we use TestSQLContext and TestHive which are global singletons for all of our unit testing. On Mon, Dec 15, 2014 at 1:27 PM, Marius Soutier wrote: > Possible, yes, although I’m trying everything I can to prevent it, i.
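
For context, a minimal sketch of the global-singleton pattern described above; the helper object name is made up and this is not Spark's actual TestSQLContext:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // One SparkContext/SQLContext shared by every suite in the JVM,
    // loosely modeled on Spark's own TestSQLContext singleton.
    object SharedTestContext {
      lazy val sc: SparkContext = new SparkContext(
        new SparkConf().setMaster("local[2]").setAppName("unit-tests"))

      lazy val sqlContext: SQLContext = new SQLContext(sc)
    }

Suites then import SharedTestContext.sqlContext instead of constructing their own contexts, so only one SparkContext ever exists per JVM.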

Re: Intermittent test failures

2014-12-15 Thread Marius Soutier
Possible, yes, although I’m trying everything I can to prevent it, i.e. fork in Test := true and isolated. Can you confirm that reusing a single SparkContext for multiple tests poses a problem as well? Other than that, just switching from SQLContext to HiveContext also provoked the error. On
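A sketch of the sbt settings being referred to (sbt 0.13-era syntax); the parallelExecution line is an assumption about how the suites were isolated, not something stated in the thread:

    // build.sbt
    fork in Test := true                // run tests in a separate JVM
    parallelExecution in Test := false  // run suites sequentially in that JVM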

Re: Intermittent test failures

2014-12-15 Thread Michael Armbrust
Is it possible that you are starting more than one SparkContext in a single JVM without stopping previous ones? I'd try testing with Spark 1.2, which will throw an exception in this case. On Mon, Dec 15, 2014 at 8:48 AM, Marius Soutier wrote: > Hi, I’m seeing strange, random errors when r
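A minimal sketch of the "stop the previous context" discipline being suggested, assuming ScalaTest; the suite name and test contents are hypothetical:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    class MyJobSuite extends FunSuite with BeforeAndAfterAll {
      private var sc: SparkContext = _

      override def beforeAll(): Unit = {
        sc = new SparkContext(
          new SparkConf().setMaster("local[2]").setAppName("MyJobSuite"))
      }

      // Always stop the context so the next suite in the same JVM starts clean.
      override def afterAll(): Unit = {
        if (sc != null) sc.stop()
      }

      test("word count") {
        val counts = sc.parallelize(Seq("a", "b", "a"))
          .map((_, 1)).reduceByKey(_ + _).collectAsMap()
        assert(counts("a") === 2)
      }
    }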

Intermittent test failures

2014-12-15 Thread Marius Soutier
Hi, I’m seeing strange, random errors when running unit tests for my Spark jobs. In this particular case I’m using Spark SQL to read and write Parquet files, and one error that I keep running into is this one: org.apache.spark.SparkException: Job aborted due to stage failure: Task 19 in stage
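A hypothetical sketch of the kind of Parquet round trip described, using the Spark 1.1/1.2-era SchemaRDD API; the Record case class and paths are made up for illustration:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    case class Record(key: Int, value: String)

    object ParquetRoundTrip {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setMaster("local[2]").setAppName("parquet-roundtrip"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD conversion

        // Write a small dataset as Parquet, then read it back and query it.
        val records = sc.parallelize(1 to 100).map(i => Record(i, s"val_$i"))
        records.saveAsParquetFile("/tmp/records.parquet")

        val loaded = sqlContext.parquetFile("/tmp/records.parquet")
        loaded.registerTempTable("records")
        println(sqlContext.sql("SELECT COUNT(*) FROM records").collect().mkString)

        sc.stop()
      }
    }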