Hi All,

I am trying to execute a sample Spark script (one that uses Spark JDBC) which depends on a set of custom jars. These custom jars need to be added to the classpath first. Currently, I have copied the custom lib directory to all the nodes and am able to execute the script with the command below.


bin/spark-shell \
  --conf spark.driver.extraClassPath=/custom-jars/* \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.extraClassPath=/custom-jars/* \
  --conf spark.executor.userClassPathFirst=true \
  --master yarn -i /tmp/spark-test.scala


Is there an option that does not require the jars to be copied to all the nodes (while still allowing them to be placed first on the classpath)? The --jars and --archives options do not seem to be working for me; a sketch of the kind of invocation I have been attempting is below. Any suggestions would be appreciated!
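For reference, this is roughly the command I would expect to work, assuming --jars distributes the listed jars to the driver and executors and userClassPathFirst puts them ahead of Spark's own classes (the jar names here are placeholders; I may be combining the options incorrectly):

bin/spark-shell \
  --jars /custom-jars/custom-lib-a.jar,/custom-jars/custom-lib-b.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --master yarn -i /tmp/spark-test.scala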


Thanks,


Arjun

