Hi Naresh,

You could use "--conf spark.driver.extraClassPath=<PATH TO JAR FILE>". Note
that the jar will not be shipped to the executors, if its a class that is
needed on the executors as well you should provide "--conf
spark.executor.extraClassPath=<PATH TO JAR FILE>". Note that if you do
provide executor extraclasspath the jar file needs to be present on all the
executors.
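
For example, a full submit command might look like the following sketch.
The jar path, application class, and application jar name are placeholders;
adjust them to your setup. Since extraClassPath entries are prepended to
the classpath, the jar you point at takes precedence over the copy shipped
with the cluster:

  spark-submit \
    --conf spark.driver.extraClassPath=/opt/jars/hive-exec-2.3.5.jar \
    --conf spark.executor.extraClassPath=/opt/jars/hive-exec-2.3.5.jar \
    --class com.example.MyApp \
    my-app.jar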

Regards,
Keith.

http://keith-chapman.com


On Wed, Jun 19, 2019 at 8:57 PM naresh Goud <nareshgoud.du...@gmail.com>
wrote:

> Hello All,
>
> How can we override jars in spark submit?
> We have the hive-exec-spark jar, which is available as part of the default
> spark cluster jars.
> We want to override the above-mentioned jar in spark submit with a newer
> version of the jar.
> How do we do that?
>
>
> Thank you,
> Naresh
> --
> Thanks,
> Naresh
> www.linkedin.com/in/naresh-dulam
> http://hadoopandspark.blogspot.com/
>
>