Does this perhaps have to do with the spark.closure.serializer?
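
If so, keep in mind that the closure serializer is configured independently
of the data serializer. Something like this (a sketch using the standard
config keys; spark.closure.serializer defaults to Java serialization):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      // serializer used for task results and shuffle data
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // serializer used for closures; configured separately and
      // defaults to org.apache.spark.serializer.JavaSerializer
      .set("spark.closure.serializer", "org.apache.spark.serializer.JavaSerializer")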

On Sat, May 3, 2014 at 7:50 AM, Soren Macbeth <so...@yieldbot.com> wrote:

> Poking around in the bowels of Scala, it seems like this has something to
> do with implicit Scala -> Java collection munging. Why would it be doing
> this, and where? The stack trace given is entirely unhelpful to me. Is there
> a better one buried in my task logs? None of my tasks actually failed, so
> it seems that it's dying while trying to fetch results from my tasks to
> return to the driver.
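>
> To make that concrete, here's my guess at the kind of wrapping involved
> (a sketch; the values are made up):
>
>     import scala.collection.JavaConverters._
>
>     // asJava wraps the Seq in scala.collection.convert.Wrappers$SeqWrapper,
>     // an inner class whose synthetic, final $outer field points back at the
>     // enclosing Wrappers object -- the same field the exception complains about
>     val wrapped: java.util.List[Int] = Seq(1, 2, 3).asJava
>     println(wrapped.getClass.getName)
>     // scala.collection.convert.Wrappers$SeqWrapper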
>
> Am I close?
>
>
> On Fri, May 2, 2014 at 3:35 PM, Soren Macbeth <so...@yieldbot.com> wrote:
>
>> Hello,
>>
>> I've been getting this rather crazy Kryo exception trying to run my Spark job:
>>
>> Exception in thread "main" org.apache.spark.SparkException: Job aborted:
>> Exception while deserializing and fetching task:
>> com.esotericsoftware.kryo.KryoException:
>> java.lang.IllegalArgumentException: Can not set final
>> scala.collection.convert.Wrappers field
>> scala.collection.convert.Wrappers$SeqWrapper.$outer to my.custom.class
>> Serialization trace:
>> $outer (scala.collection.convert.Wrappers$SeqWrapper)
>>         at
>> org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
>>         at
>> org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
>>         at
>> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>>         at
>> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>>         at
>> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
>>         at
>> org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
>>         at
>> org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
>>         at scala.Option.foreach(Option.scala:236)
>>         at
>> org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
>>         at
>> org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
>>         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
>>         at akka.actor.ActorCell.invoke(ActorCell.scala:456)
>>         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
>>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>>         at
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>>         at
>> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>         at
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>         at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>         at
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>
>> I have a Kryo serializer for my.custom.class, and I've registered it using
>> a custom registrator on my context object. I've tested the custom
>> serializer and the registrator locally, and they both function as expected.
>> This job is running Spark 0.9.1 under Mesos in fine-grained mode.
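>>
>> For reference, the registrator is wired up roughly like this (a sketch;
>> MyCustomClass and MyCustomSerializer are placeholders for my.custom.class
>> and its serializer):
>>
>>     import com.esotericsoftware.kryo.Kryo
>>     import org.apache.spark.serializer.KryoRegistrator
>>
>>     class MyRegistrator extends KryoRegistrator {
>>       override def registerClasses(kryo: Kryo) {
>>         // placeholder names for the real class and its Kryo serializer
>>         kryo.register(classOf[MyCustomClass], new MyCustomSerializer)
>>       }
>>     }
>>
>> and hooked into the context via conf.set("spark.kryo.registrator",
>> "my.pkg.MyRegistrator") (package name is a placeholder) alongside
>> conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer").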
>>
>> Please help!
>>
>
>
