[ https://issues.apache.org/jira/browse/SPARK-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14341809#comment-14341809 ]

Pat Ferrel commented on SPARK-6069:
-----------------------------------

Embarrassed to say I'm still on Hadoop 1.2.1, so no YARN. The packaging is not 
in the app jar but in a separate, pruned-down, dependencies-only jar. I can see 
why YARN would throw a unique kink into the situation. So I guess you ran into 
this and had to use the {{user.classpath.first}} workaround, or are you saying 
it doesn't occur in Oryx?

Still, none of this should be necessary, right? Why else would jars be specified 
at context creation? We do have a workaround if someone has to work with 1.2.1, 
but because of that it doesn't seem like a good version to recommend. Maybe I'll 
try 1.2 and install H2 and YARN, which seems to be what the distros support.
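
For concreteness, a minimal sketch of the setup being discussed: passing the 
pruned dependencies-only jar at context creation and flipping the 
classpath-precedence workaround. The jar path, app name, master URL, and the 
exact property name are placeholder assumptions, not taken from the issue; the 
relevant property differs between standalone and YARN deployments and across 
Spark versions.

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: jar path, app name, and master URL are placeholders.
val conf = new SparkConf()
  .setAppName("guava-kryo-repro")
  .setMaster("spark://localhost:7077")
  // Jars listed here are shipped to the executors, which is why a
  // ClassNotFoundException during deserialization is surprising.
  .setJars(Seq("/path/to/deps-only.jar"))
  // Classpath-precedence workaround; this property name is an assumption
  // and varies between standalone/YARN and across Spark versions.
  .set("spark.files.userClassPathFirst", "true")

val sc = new SparkContext(conf)
{code}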

> Deserialization Error ClassNotFoundException with Kryo, Guava 14
> ----------------------------------------------------------------
>
>                 Key: SPARK-6069
>                 URL: https://issues.apache.org/jira/browse/SPARK-6069
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Standalone one-worker cluster on localhost, or any 
> cluster
>            Reporter: Pat Ferrel
>            Priority: Critical
>
> A class is contained in the jars passed in when creating a context. It is 
> registered with Kryo. The class (Guava HashBiMap) is created correctly from 
> an RDD and broadcast, but the deserialization fails with ClassNotFound.
> The workaround is to hard-code the path to the jar and make it available on 
> all workers. Hard-coded because we are creating a library, so there is no 
> easy way to pass something like the following in to the app:
> spark.executor.extraClassPath      /path/to/some.jar
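
To make the quoted description concrete, below is a minimal sketch of the 
pattern it describes: registering Guava's HashBiMap with Kryo and, as the 
hard-coded workaround, pointing {{spark.executor.extraClassPath}} at a jar 
present on every worker. The registrator class name and the jar path are 
illustrative, not taken from the actual library.

{code:scala}
import com.esotericsoftware.kryo.Kryo
import com.google.common.collect.HashBiMap
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator

// A registrator of the kind described above; the class name is illustrative.
// Use its fully-qualified name below if it lives in a package.
class GuavaRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[HashBiMap[String, String]])
  }
}

// Kryo serialization plus the hard-coded classpath workaround
// (the jar path is a placeholder and must exist on every worker).
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "GuavaRegistrator")
  .set("spark.executor.extraClassPath", "/path/to/some.jar")
{code}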


