Are you running from the spark shell or from a standalone job?

On Mon, Mar 17, 2014 at 4:17 PM, Walrus theCat <walrusthe...@gmail.com> wrote:

> Hi,
>
> I'm getting the stack trace below on Spark 0.7.3.  It doesn't reference
> anything in my code, and I've never seen anything like this before.  Any
> ideas what is going on?
>
> java.lang.ClassCastException: spark.SparkContext$$anonfun$9 cannot be cast to scala.Function2
>     at spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:43)
>     at spark.scheduler.ResultTask.readExternal(ResultTask.scala:106)
>     at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
>     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>     at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
>     at spark.JavaSerializerInstance.deserialize(JavaSerializer.scala:45)
>     at spark.executor.Executor$TaskRunner.run(Executor.scala:96)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:744)
>