[ https://issues.apache.org/jira/browse/SPARK-1877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14002598#comment-14002598 ]
Tathagata Das commented on SPARK-1877:
--------------------------------------
Can you please give us the steps to reproduce this problem? I am guessing this
can be reproduced using a local standalone cluster.
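For reference, a minimal scenario along these lines should exercise the failing
path (the class name and output path below are made up for illustration, and
sc is assumed to be a SparkContext connected to a standalone master):

{code}
// Hypothetical repro sketch: a user-defined class that lives only in the
// application jar, written out as an object file and read back.
case class Record(id: Int, name: String)

val data = sc.parallelize(Seq(Record(1, "a"), Record(2, "b")))
data.saveAsObjectFile("/tmp/records")

// On a standalone cluster, sc.objectFile deserializes with the default
// classloader, which cannot see the application jar, so collecting the
// result should throw ClassNotFoundException for Record.
val loaded = sc.objectFile[Record]("/tmp/records")
loaded.collect()
{code}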
> ClassNotFoundException when loading RDD with serialized objects
> ---------------------------------------------------------------
>
> Key: SPARK-1877
> URL: https://issues.apache.org/jira/browse/SPARK-1877
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.0.0
> Environment: standalone Spark cluster, jdk 1.7
> Reporter: Bogdan Ghidireac
>
> When I load an RDD that contains custom serialized objects, Spark throws a
> ClassNotFoundException. This happens only when Spark is deployed as a
> standalone cluster; it works fine when Spark runs in local mode.
> I debugged the issue and noticed that ObjectInputStream.resolveClass does
> not use the ExecutorURLClassLoader set by SparkSubmit. You have to explicitly
> set the classloader for the ObjectInputStream in SparkContext.objectFile when
> deserializing objects:
> Utils.deserialize[Array[T]](..., Thread.currentThread.getContextClassLoader)
> I will attach a patch shortly...
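For illustration, the kind of fix the description suggests is to route
deserialization through an ObjectInputStream that resolves classes via an
explicit ClassLoader. A minimal sketch in Scala (not the actual patch; the
class and method names here are invented for the example):

{code}
import java.io.{ByteArrayInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

// Sketch only: an ObjectInputStream that resolves classes through a
// supplied ClassLoader instead of the JVM's default resolution.
class ClassLoaderObjectInputStream(in: InputStream, loader: ClassLoader)
    extends ObjectInputStream(in) {
  override def resolveClass(desc: ObjectStreamClass): Class[_] =
    Class.forName(desc.getName, false, loader)
}

// Deserialize with the given loader. Passing the context classloader
// (which SparkSubmit points at ExecutorURLClassLoader) lets classes from
// the application jar resolve correctly.
def deserialize[T](bytes: Array[Byte], loader: ClassLoader): T = {
  val ois = new ClassLoaderObjectInputStream(new ByteArrayInputStream(bytes), loader)
  try ois.readObject().asInstanceOf[T] finally ois.close()
}

// usage sketch:
// deserialize[Array[Record]](bytes, Thread.currentThread.getContextClassLoader)
{code}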