I only know of a way to do that with YARN.

You can distribute the jar files using "--files" and add just their
file names (not the full paths) to the "extraClassPath" configs. You
don't need "userClassPathFirst" in that case.
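
A minimal sketch of that approach, assuming two hypothetical jar names
custom-a.jar and custom-b.jar (substitute your actual jars):

bin/spark-shell --master yarn \
  --files /custom-jars/custom-a.jar,/custom-jars/custom-b.jar \
  --conf spark.driver.extraClassPath=custom-a.jar:custom-b.jar \
  --conf spark.executor.extraClassPath=custom-a.jar:custom-b.jar \
  -i /tmp/spark-test.scala

Note that "--files" takes a comma-separated list and does not expand
wildcards, so each jar has to be listed explicitly. The extraClassPath
entries use bare file names because YARN localizes the distributed
files into each container's working directory.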

On Thu, Jun 14, 2018 at 1:28 PM, Arjun kr <arjun...@outlook.com> wrote:
> Hi All,
>
>
> I am trying to execute a sample Spark script (one that uses Spark JDBC)
> which has dependencies on a set of custom jars. These custom jars need to
> be added first to the classpath. Currently, I have copied the custom lib
> directory to all the nodes and am able to execute the script with the
> command below.
>
>
> bin/spark-shell \
>   --conf spark.driver.extraClassPath=/custom-jars/* \
>   --conf "spark.driver.userClassPathFirst=true" \
>   --conf spark.executor.extraClassPath=/custom-jars/* \
>   --conf "spark.executor.userClassPathFirst=true" \
>   --master yarn \
>   -i /tmp/spark-test.scala
>
>
> Are there any options that do not require the jars to be copied to all
> the nodes (while still allowing them to be placed first on the
> classpath)? The --jars and --archives options seem not to be working for
> me. Any suggestions would be appreciated!
>
>
> Thanks,
>
>
> Arjun
>
>
>

-- 
Marcelo
