> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
From: ilike...@gmail.com
Date: Mon, 23 Jun 2014 18:00:27 -0700
Subject: Re: DAGScheduler: Failed to run foreach

> The subject should be: org.apache.spark.SparkException: Job aborted due to
> stage failure: Task not serializable: java.io.NotSerializableException: and
> not DAGScheduler: Failed to run foreach
>
> If I call printScoreCanndedString with a hard-coded string and identical
> 2nd parameter, it works fine. However, for my application that is not
> sufficient.
>
The subject should be: org.apache.spark.SparkException: Job aborted due to
stage failure: Task not serializable: java.io.NotSerializableException: and
not DAGScheduler: Failed to run foreach

If I call printScoreCanndedString with a hard-coded string and identical 2nd
parameter, it works fine. However, for my application that is not sufficient.
14/06/23 16:45:04 INFO DAGScheduler: Missing parents: List()
14/06/23 16:45:04 INFO DAGScheduler: Submitting Stage 0 (MappedRDD[1] at textFile at :12), which has no missing parents
14/06/23 16:45:04 INFO DAGScheduler: Failed to run foreach at CalculateScore.scala:51
org.apache.spark.SparkException:
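The symptom described above — a hard-coded string works, but passing application state does not — is the classic Spark closure-capture problem: the function handed to foreach references an object (here, presumably whatever owns printScoreCanndedString) that does not implement java.io.Serializable, so Spark cannot serialize the task to ship it to executors. Below is a minimal Java sketch of the underlying failure; the class names (ScorePrinter, SerializableScorePrinter) are hypothetical stand-ins, not from this thread:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureCaptureDemo {

    // Hypothetical stand-in for the object owning the print method.
    // It does NOT implement Serializable, so Java serialization
    // (which Spark uses by default for task closures) rejects it.
    static class ScorePrinter {
        void printScore(String s) { System.out.println(s); }
    }

    // The usual fix: make the captured class Serializable
    // (or restrict the closure to local, serializable values).
    static class SerializableScorePrinter implements Serializable {
        void printScore(String s) { System.out.println(s); }
    }

    // Attempts Java serialization, the same check Spark effectively
    // performs when it serializes a task closure.
    static boolean canSerialize(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            // This is the exception surfacing in the Spark log above.
            return false;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(canSerialize(new ScorePrinter()));             // false
        System.out.println(canSerialize(new SerializableScorePrinter())); // true
    }
}
```

In Spark terms: when the foreach closure only uses a hard-coded string, nothing non-serializable is captured and the job runs; once it references a field or method of an enclosing non-serializable object, the whole object is dragged into the closure and serialization fails with exactly this SparkException.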