[ 
https://issues.apache.org/jira/browse/SPARK-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14341705#comment-14341705
 ] 

Pat Ferrel commented on SPARK-6069:
-----------------------------------

I agree, that part makes me suspicious, which is why I’m not sure I trust my 
builds completely.

No, the 'app' is one of the Spark-Mahout CLI drivers. The jar is a 
dependency-reduced artifact that contains only scopt and guava.

In any case, if I put 
-D:spark.executor.extraClassPath=/Users/pat/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-dependency-reduced.jar
 on the command line, which passes the key=value pair through the SparkConf to 
the Mahout CLI driver, it works. The test setup is a standalone, localhost-only 
cluster (not local[n]), started with sbin/start-all.sh. The same jar is used to 
create the context, and I've checked both it and its contents quite carefully.
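For reference, the programmatic equivalent of that command-line flag is setting the key on the SparkConf before the context is created. A minimal sketch in Scala; the app name and master URL are placeholders, not the actual Mahout driver code:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: mirrors passing -D:spark.executor.extraClassPath=... on the
// command line of the CLI driver. Paths assume the localhost-only standalone
// cluster described above, where the same path is valid on the worker.
val jar = "/Users/pat/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-dependency-reduced.jar"
val conf = new SparkConf()
  .setAppName("mahout-driver")                // placeholder app name
  .setMaster("spark://localhost:7077")        // standalone master, not local[n]
  .setJars(Seq(jar))                          // ships the jar to executors
  .set("spark.executor.extraClassPath", jar)  // the workaround under discussion
val sc = new SparkContext(conf)
```

setJars should already make these classes available to executors; the point of the thread is that, without the extraClassPath entry, executor-side Kryo deserialization still fails with ClassNotFound.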

On Feb 28, 2015, at 10:09 AM, Sean Owen (JIRA) <[email protected]> wrote:


   [ 
https://issues.apache.org/jira/browse/SPARK-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14341699#comment-14341699
 ] 

Sean Owen commented on SPARK-6069:
----------------------------------

Hm, the thing is I have been successfully running an app, without spark-submit, 
with kryo, with Guava 14 just like you and have never had a problem. I can't 
figure out what the difference is here.

The Kryo not-found exception is stranger still. You aren't packaging Spark 
classes with your app, right?




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)



> Deserialization Error ClassNotFound 
> ------------------------------------
>
>                 Key: SPARK-6069
>                 URL: https://issues.apache.org/jira/browse/SPARK-6069
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Standalone one worker cluster on localhost, or any 
> cluster
>            Reporter: Pat Ferrel
>
> A class is contained in the jars passed in when creating a context. It is 
> registered with Kryo. The class (Guava HashBiMap) is created correctly from 
> an RDD and broadcast, but deserialization fails with ClassNotFound.
> The workaround is to hard-code the path to the jar and make it available on 
> all workers. Hard-coded because we are creating a library, so there is no 
> easy way to pass something like the following in to the app:
> spark.executor.extraClassPath      /path/to/some.jar
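For context, registering a class such as Guava's HashBiMap with Kryo in Spark is typically done through a KryoRegistrator. A hedged sketch; the registrator class name is illustrative, not Mahout's actual code:

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator

// Illustrative registrator: registers the class named in the bug report.
class GuavaRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[com.google.common.collect.HashBiMap[_, _]])
  }
}

// Wiring it into the conf (class name is a placeholder):
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "GuavaRegistrator")
```

Registration only tells Kryo how to serialize the class; it is the executor's classloader, not the registration, that must be able to find the class at deserialization time, which is why the classpath workaround matters here.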



