Have you tried SparkContext.addJar()?
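In case it helps, here is a minimal sketch of that approach. The HDFS path, jar name, and app name are placeholders, not something from your setup:

```scala
// Sketch: distributing an external jar to executors at runtime.
// Requires a Spark deployment; paths below are made-up examples.
import org.apache.spark.{SparkConf, SparkContext}

object AddJarExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("addjar-example"))

    // addJar ships the jar to every executor; tasks submitted after this
    // call can load classes from it. HDFS URIs are accepted here.
    sc.addJar("hdfs:///user/spark/libs/algebird-core_2.10.jar")

    // ... run your job ...

    sc.stop()
  }
}
```

Note that addJar affects the executor classpath, not the driver's, so the driver still needs the jar on its own classpath.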

Why didn't you like the fat-jar approach?
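If you want the jars to live in HDFS rather than in the assembly, something like the following should work with spark-submit. This is only a sketch; the jar names, HDFS paths, and class name are placeholders:

```shell
# Upload the dependency jars to HDFS once (example paths):
hadoop fs -mkdir -p /user/spark/libs
hadoop fs -put algebird-core_2.10.jar mongo-java-driver.jar /user/spark/libs/

# Reference them at submit time; on YARN the comma-separated --jars
# list is shipped to the executors, and hdfs:// URIs avoid re-uploading
# the jars from the client on every submission:
spark-submit --master yarn-cluster \
  --class com.example.MyJob \
  --jars hdfs:///user/spark/libs/algebird-core_2.10.jar,hdfs:///user/spark/libs/mongo-java-driver.jar \
  myjob.jar
```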

2014-09-25 16:25 GMT+04:00 rzykov <rzy...@gmail.com>:

> We build some Spark jobs with external jars. Currently I compile the jobs
> by including the dependencies in one assembly, but I am looking for an
> approach that puts all the external jars into HDFS instead.
>
> We have already put the Spark jar in an HDFS folder and set the SPARK_JAR
> variable. What is the best way to do the same for other external jars
> (MongoDB, Algebird, and so on)?
>
> Thanks in advance
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/SPARK-1-1-0-on-yarn-cluster-and-external-JARs-tp15136.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>


-- 



Sincerely yours,
Egor Pakhomov
Scala Developer, Yandex
