[ https://issues.apache.org/jira/browse/SPARK-1089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14056551#comment-14056551 ]

Nicholas Chammas commented on SPARK-1089:
-----------------------------------------

So going forward, is setting {{SPARK_CLASSPATH}} the correct procedure for 
adding external jars? Is there a doc somewhere that details this process? And 
what is the difference between setting these environment variables and calling 
{{sc.addJar()}} from within the shell?
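
For reference, the workaround described in this issue can be sketched as 
follows (the jar path is hypothetical, and the behavior described is as 
reported for Spark 0.9.0):

```shell
# Workaround from this issue: set BOTH variables before launching spark-shell.
# ADD_JARS ships the jar to the workers; SPARK_CLASSPATH additionally puts it
# on the driver/shell classpath so classes can be imported in the REPL.
export ADD_JARS=/path/to/my-lib.jar        # hypothetical path
export SPARK_CLASSPATH=/path/to/my-lib.jar
./bin/spark-shell

# By contrast, calling sc.addJar() from inside an already-running shell
# distributes the jar to the workers, but (per this issue, on 0.9.0) does not
# add it to the shell's own classpath:
#   scala> sc.addJar("/path/to/my-lib.jar")
```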

> ADD_JARS regression in Spark 0.9.0
> ----------------------------------
>
>                 Key: SPARK-1089
>                 URL: https://issues.apache.org/jira/browse/SPARK-1089
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Andrew Ash
>            Assignee: Nan Zhu
>            Priority: Blocker
>             Fix For: 0.9.1, 1.0.0
>
>
> Using the ADD_JARS environment variable with spark-shell used to add the jar 
> to both the shell and the various workers.  Now it only adds to the workers 
> and importing a custom class in the shell is broken.
> The workaround is to add custom jars to both ADD_JARS and SPARK_CLASSPATH.
> We should fix ADD_JARS so it works properly again.
> See various threads on the user list:
> https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201402.mbox/%3ccajbo4nemlitrnm1xbyqomwmp0m+eucg4ye-txurgsvkob5k...@mail.gmail.com%3E
> (another one that doesn't appear in the archives yet titled "ADD_JARS not 
> working on 0.9")



--
This message was sent by Atlassian JIRA
(v6.2#6252)
