Hello,

How can we use *spark.jars* to specify conflicting jars (that is, jars
that are already present in Spark's default classpath)? Jars specified
in this conf get "appended" to the classpath, and thus get looked at
after the default classpath. Is it not intended to be used to specify
conflicting jars?
Meanwhile, when the *spark.driver.extraClassPath* conf is specified, its path
is "prepended" to the classpath and thus takes precedence over the default
classpath.

How can I use both to specify different jars and paths, but achieve a
precedence order of spark.jars > spark.driver.extraClassPath > Spark default
classpath (left-to-right)?
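
For reference, this is the kind of check I run on the driver to see the
effective settings (a minimal Scala sketch, assuming an existing SparkSession
named `spark`; as far as I can tell, jars from spark.jars are added at runtime
and so do not appear in java.class.path, while extraClassPath entries and the
defaults do):

    // Inspect the two confs in question and the driver JVM's launch classpath.
    println(spark.conf.getOption("spark.jars"))
    println(spark.conf.getOption("spark.driver.extraClassPath"))
    System.getProperty("java.class.path")
      .split(java.io.File.pathSeparator)
      .foreach(println)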

Experiment conducted:

I am using sample-project.jar, which contains a single class, SampleProject,
with a method that prints the jar's version number. For this experiment I am
using three versions of sample-project.jar:
- sample-project-1.0.0.jar is present in the Spark default classpath on my
test cluster
- sample-project-2.0.0.jar is present in the folder /home/<user>/ClassPathConf
on the driver
- sample-project-3.0.0.jar is present in the folder /home/<user>/JarsConf
on the driver

(An empty cell in the image below means that conf was not specified.)

[image: image.png]
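
In each run I check which copy of the class actually wins with a small
driver-side check along these lines (a minimal sketch; the class name
SampleProject comes from my test jar, and the thread context classloader is
used because, as I understand it, it should also see jars added via
spark.jars):

    // Load the class and print which jar it was actually resolved from.
    val cls = Thread.currentThread().getContextClassLoader.loadClass("SampleProject")
    println(Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation))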


Thank you,
Nupur
