Hi,

This is the list of jars used by your job; the driver sends all of those
jars to each worker (otherwise the workers won't have the classes your job
needs). The easiest way to go is to build a fat jar with your code and all
the libraries you depend on, and then use this utility to get its path:
SparkContext.jarOfClass(YourJob.getClass)
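For example, here is a minimal sketch of a self-contained driver (MyJob is
a placeholder name; depending on your Spark version, jarOfClass returns a
Seq[String] or an Option[String], so the .toSeq below is just a defensive
conversion):

import org.apache.spark.SparkContext

object MyJob {
  def main(args: Array[String]) {
    // Locate the jar that contains this class (i.e. the fat jar you built)
    // so the driver can ship it to the workers.
    val jars = SparkContext.jarOfClass(MyJob.getClass).toSeq

    val sc = new SparkContext(args(0), "My Job",
      System.getenv("SPARK_HOME"), jars)

    // ... your job logic here ...

    sc.stop()
  }
}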


2014/1/2 Aureliano Buendia <[email protected]>

> Hi,
>
> I do not understand why spark context has an option for loading jars at
> runtime.
>
> As an example, consider this:
> https://github.com/apache/incubator-spark/blob/50fd8d98c00f7db6aa34183705c9269098c62486/examples/src/main/scala/org/apache/spark/examples/BroadcastTest.scala#L36
>
> object BroadcastTest {
>   def main(args: Array[String]) {
>     val sc = new SparkContext(args(0), "Broadcast Test",
>       System.getenv("SPARK_HOME"), Seq(System.getenv("SPARK_EXAMPLES_JAR")))
>   }
> }
>
>
> This is *the* example, i.e. *the* application that we want to run, so what 
> is SPARK_EXAMPLES_JAR supposed to be?
> In this particular case, the BroadcastTest example is self-contained, so why 
> would it want to load other unrelated example jars?
>
> Finally, how does this help a real world spark application?
>
>
