Re: Which directory contains third party libraries for Spark
Hey Stephen,

If these libraries exist on the client as a Maven library, you can use --packages to ship the library and all its dependencies, without building an uber jar.

Best,
Burak

On Tue, Jul 28, 2015 at 10:23 AM, Marcelo Vanzin wrote:

> Hi Stephen,
>
> There is no such directory currently. If you want to add an existing jar
> to every app's classpath, you need to modify two config values:
> spark.driver.extraClassPath and spark.executor.extraClassPath.
>
> On Mon, Jul 27, 2015 at 10:22 PM, Stephen Boesch wrote:
>
>> when using spark-submit: which directory contains third party libraries
>> that will be loaded on each of the slaves? I would like to scp one or more
>> libraries to each of the slaves instead of shipping the contents in the
>> application uber-jar.
>>
>> Note: I did try adding to $SPARK_HOME/lib_managed/jars. But the
>> spark-submit still results in a ClassNotFoundException for classes included
>> in the added library.
>
> --
> Marcelo
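[For example, --packages takes Maven coordinates in groupId:artifactId:version form; the coordinate, class name, and jar name below are only illustrations — substitute your own:]

```shell
# Ship a Maven artifact and its transitive dependencies to driver and executors.
# Spark resolves the coordinate from the local Maven cache or Maven Central.
spark-submit \
  --packages com.databricks:spark-csv_2.10:1.1.0 \
  --class com.example.MyApp \
  myapp.jar
```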
Re: Which directory contains third party libraries for Spark
Hi Stephen,

There is no such directory currently. If you want to add an existing jar to every app's classpath, you need to modify two config values: spark.driver.extraClassPath and spark.executor.extraClassPath.

On Mon, Jul 27, 2015 at 10:22 PM, Stephen Boesch wrote:

> when using spark-submit: which directory contains third party libraries
> that will be loaded on each of the slaves? I would like to scp one or more
> libraries to each of the slaves instead of shipping the contents in the
> application uber-jar.
>
> Note: I did try adding to $SPARK_HOME/lib_managed/jars. But the
> spark-submit still results in a ClassNotFoundException for classes included
> in the added library.

--
Marcelo
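[A minimal sketch of passing those two configs on the command line — the path /opt/libs/mylib.jar and the class/jar names are hypothetical, and the jar must already exist at that same path on the driver and on every worker node, e.g. copied there with scp:]

```shell
# Prepend a locally-installed jar to the driver and executor classpaths.
# Unlike --packages, nothing is shipped: the file must be present on each node.
spark-submit \
  --conf spark.driver.extraClassPath=/opt/libs/mylib.jar \
  --conf spark.executor.extraClassPath=/opt/libs/mylib.jar \
  --class com.example.MyApp \
  myapp.jar
```

The same two properties can also be set once in conf/spark-defaults.conf so every application picks them up without extra flags.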
Re: Which directory contains third party libraries for Spark
Can you show us the snippet of the exception stack?

Thanks

> On Jul 27, 2015, at 10:22 PM, Stephen Boesch wrote:
>
> when using spark-submit: which directory contains third party libraries that
> will be loaded on each of the slaves? I would like to scp one or more
> libraries to each of the slaves instead of shipping the contents in the
> application uber-jar.
>
> Note: I did try adding to $SPARK_HOME/lib_managed/jars. But the
> spark-submit still results in a ClassNotFoundException for classes included
> in the added library.