[ 
https://issues.apache.org/jira/browse/SPARK-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-6069:
-----------------------------
    Priority: Critical  (was: Major)
     Summary: Deserialization Error ClassNotFoundException with Kryo, Guava 14  
(was: Deserialization Error ClassNotFound )

To clarify the properties situation: in Spark 1.2.x we have 
{{spark.files.userClassPathFirst}} _and_ {{spark.yarn.user.classpath.first}}. 
{{spark.driver.userClassPathFirst}} and {{spark.executor.userClassPathFirst}} 
are the new, more logical versions in 1.3+ only, so ignore those here.
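For reference, here is a sketch of how the two 1.2.x properties named above would look in {{spark-defaults.conf}} (the values and whether you need both are assumptions about a given setup; the YARN one only applies when running on YARN):

```
# spark-defaults.conf (Spark 1.2.x) -- sketch only
spark.files.userClassPathFirst   true
# YARN-only variant:
spark.yarn.user.classpath.first  true
```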

{{spark.yarn.user.classpath.first}} is actually what I am setting:
https://github.com/OryxProject/oryx/blob/master/oryx-lambda/src/main/java/com/cloudera/oryx/lambda/BatchLayer.java#L153

But it sounds like you are not using YARN.

Guava 14.0.1 is packaged with the app:
https://github.com/OryxProject/oryx/blob/master/pom.xml#L233

I'm running this on 1.2.0 + YARN, and also local[*] + 1.3.0-SNAPSHOT.

My question is whether this mechanism is perhaps not working for standalone 
mode in 1.2 but does work in 1.3, since it has been overhauled since 1.2.

> Deserialization Error ClassNotFoundException with Kryo, Guava 14
> ----------------------------------------------------------------
>
>                 Key: SPARK-6069
>                 URL: https://issues.apache.org/jira/browse/SPARK-6069
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Standalone one worker cluster on localhost, or any 
> cluster
>            Reporter: Pat Ferrel
>            Priority: Critical
>
> A class is contained in the jars passed in when creating a context and is 
> registered with Kryo. The class (Guava {{HashBiMap}}) is created correctly from 
> an RDD and broadcast, but deserialization fails with a ClassNotFoundException.
> The workaround is to hard-code the path to the jar and make it available on 
> all workers. It must be hard-coded because we are creating a library, so there 
> is no easy way for the app to pass in something like:
> spark.executor.extraClassPath      /path/to/some.jar
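To illustrate the failure mode described above: Kryo resolves classes by name through the executor's classloader, so if the app jar carrying Guava 14's {{HashBiMap}} is not on that loader's classpath, the lookup throws ClassNotFoundException. A minimal stdlib-only sketch of that lookup (the class name is the one from this report; {{ClassLoadCheck}} is a hypothetical helper, not Spark code):

```java
// Sketch: a by-name class lookup, as Kryo performs during deserialization.
// Whether it succeeds depends entirely on what the classloader can see.
public class ClassLoadCheck {

    // Returns true if 'name' can be resolved by the given classloader,
    // false if resolution throws ClassNotFoundException.
    static boolean isLoadable(String name, ClassLoader loader) {
        try {
            Class.forName(name, false, loader);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        // The class from the report: found only if Guava is on the classpath.
        System.out.println("HashBiMap loadable: "
                + isLoadable("com.google.common.collect.HashBiMap", cl));
    }
}
```

If the executor side answers "false" here while the driver side answers "true", the jar distribution (or classpath-first setting) is the thing to fix.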



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
