[ https://issues.apache.org/jira/browse/SPARK-896?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-896.
-----------------------------
    Resolution: Won't Fix

I'm going to call this Won't Fix, as ADD_JARS has been deprecated for a while.
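
For anyone landing here on a current release, the usual replacement for ADD_JARS is the --jars option of spark-shell; a minimal sketch (the path and class name below are placeholders, not anything from this report):

{code:scala}
// Launch the shell with the jar shipped to the executors, e.g.:
//   ./bin/spark-shell --jars /path/to/monster-assembly.jar
// Classes from the jar should then resolve both at the driver and
// inside closures that run on the executors:
import com.example.custom.Enricher   // hypothetical class from the jar
sc.parallelize(Seq("a", "b", "c")).map(r => new Enricher().enrich(r)).collect()
{code}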

> ADD_JARS does not add all classes to classpath in the spark-shell for cluster on Mesos
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-896
>                 URL: https://issues.apache.org/jira/browse/SPARK-896
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.7.3
>            Reporter: Gary Malouf
>
> I do not believe the issue is limited to the scheduler/executors running on Mesos, but I have included that information for debugging purposes.
> h3. Reproducing the issue:
> # Implement some custom functionality and package it into a 'monster jar' with something like sbt assembly.
> # Drop this jar onto the Spark master box and specify the path to it in the 
> ADD_JARS variable.
> # Start up the spark-shell on the same box as the master.  You should be able to import packages/classes contained in the jar without any compilation trouble.
> # In a map function on an RDD, trying to call a class from within this jar (by its fully qualified name) fails with a ClassNotFoundException (see the sketch below).
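>
> A minimal sketch of the failing step as described above (the package, class, and path are placeholders for whatever is in the assembled jar):
> {code:scala}
> // Shell started on the master box with the jar listed in ADD_JARS, e.g.:
> //   ADD_JARS=/path/to/monster-assembly.jar ./spark-shell   (with MASTER pointing at the Mesos master)
> import com.example.custom.Enricher   // importing at the driver works fine
>
> val data = sc.parallelize(Seq("a", "b", "c"))
>
> // Referencing the class inside the closure is what fails on the executors with
> // java.lang.ClassNotFoundException: com.example.custom.Enricher
> data.map(r => new Enricher().enrich(r)).collect()
> {code}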
> h3. Workaround
> Matei Zaharia suggested adding this jar to the SPARK_CLASSPATH environment variable, and that resolved the issue.  My understanding, however, is that the functionality should work using the ADD_JARS variable alone; the documentation does not capture this.
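>
> For reference, a sketch contrasting the two launches, plus one way to check that the class is visible on the executors and not only at the driver (paths and the class name are placeholders):
> {code:scala}
> // Launch that hit ClassNotFoundException inside RDD closures:
> //   ADD_JARS=/path/to/monster-assembly.jar ./spark-shell
> // Launch with the suggested workaround, which resolved it for the reporter:
> //   SPARK_CLASSPATH=/path/to/monster-assembly.jar ./spark-shell
>
> // Forcing the class to load inside a closure exercises the executor classpath:
> sc.parallelize(1 to 4).map(_ => Class.forName("com.example.custom.Enricher").getName).distinct().collect()
> {code}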



