[ https://issues.apache.org/jira/browse/SPARK-9479?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-9479:
-----------------------------------

    Assignee:     (was: Apache Spark)

> ReceiverTrackerSuite fails for maven build
> ------------------------------------------
>
>                 Key: SPARK-9479
>                 URL: https://issues.apache.org/jira/browse/SPARK-9479
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming, Tests
>            Reporter: Shixiong Zhu
>
> The test failure is here:
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-Maven-with-YARN/3109/
>
> I saw the following exception in the log:
> {code}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.NullPointerException
> org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
> org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
> org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
> org.apache.spark.SparkContext.broadcast(SparkContext.scala:1297)
> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:834)
> {code}
> This exception occurs because SparkEnv.get returns null.
>
> I found that the maven build differs from the sbt build: the maven build creates all Suite classes at the beginning. `ReceiverTrackerSuite` creates its StreamingContext (and thus a SparkContext) in the constructor, which means the SparkContext is created very early, and the global SparkEnv is then set to null by a previous test. That is why we see the above exception when running `Receiver tracker - propagates rate limit` in `ReceiverTrackerSuite`. This test was added recently.
>
> Note: the previous tests in `ReceiverTrackerSuite` didn't actually use the SparkContext, which is why we didn't see such a failure before.
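The failure mode described above can be sketched without any Spark dependencies. This is a minimal illustration (not the actual Spark test code): a suite that reads a mutable global in its constructor captures whatever value the global holds at class-construction time, while a suite that defers the lookup into the test body sees the value current when the test actually runs. `GlobalEnv` here is a hypothetical stand-in for SparkEnv.

```scala
// Stand-in for the mutable global SparkEnv (SparkEnv.get analogue).
object GlobalEnv {
  @volatile var env: String = null
}

// Eager: the field is initialized when the suite instance is built,
// long before any test runs -- mirroring the maven runner creating
// all Suite classes up front.
class EagerSuite {
  val env: String = GlobalEnv.env // may capture null left by a previous test
}

// Lazy: the lookup happens inside the test body, after setup has
// restored the global -- the usual fix of deferring context creation.
class LazySuite {
  def runTest(): String = GlobalEnv.env
}

object Demo extends App {
  GlobalEnv.env = null            // a previous test nulled the global
  val eager = new EagerSuite      // suites instantiated early, as in maven
  GlobalEnv.env = "live-env"      // setup for the current test
  println(s"eager: ${eager.env}")                  // eager: null
  println(s"lazy:  ${new LazySuite().runTest()}")  // lazy:  live-env
}
```

The same ordering hazard is why moving SparkContext creation out of a suite's constructor and into per-test setup makes the test robust to whatever state earlier suites leave behind.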
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org