Re: Path of jars added to a Spark Job - spark-submit // // Override jars in spark submit

2020-11-12 Thread Dominique De Vito
Thanks Mich. To be sure, are you really saying that, using the option "spark.yarn.archive", YOU have been able to OVERRIDE the installed Spark JARs with the JARs given through "spark.yarn.archive"? Nothing more than "spark.yarn.archive"? Thanks, Dominique. On Thu, 12 Nov 2020 at 18:01, Mich

Re: Path of jars added to a Spark Job - spark-submit // // Override jars in spark submit

2020-11-12 Thread Dominique De Vito
Thanks Russell.

> Since the driver is responsible for moving jars specified in --jars, you cannot use a jar specified by --jars in driver-class-path, since the driver is already started and its classpath is already set before any jars are moved.

Your point is interesting; however, I see

Re: Path of jars added to a Spark Job - spark-submit // // Override jars in spark submit

2020-11-12 Thread Mich Talebzadeh
As I understand it, Spark expects the jar files to be available on all nodes or, if applicable, in an HDFS directory.

Putting Spark jar files on HDFS

In YARN mode, *it is important that Spark jar files are available throughout the Spark cluster*. I have spent a fair bit of time on this and I recommend
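A minimal sketch of the approach Mich describes, following the usual recipe (the HDFS path, class name, and application jar below are placeholders, not from the thread): bundle the contents of $SPARK_HOME/jars into an uncompressed archive, publish it on HDFS, and point spark.yarn.archive at it so YARN serves the jars from its cache instead of uploading them on every submit.

    # build an uncompressed archive of the Spark jars
    jar cv0f spark-libs.jar -C $SPARK_HOME/jars/ .

    # publish it on HDFS (path is hypothetical)
    hdfs dfs -mkdir -p /user/spark/share
    hdfs dfs -put spark-libs.jar /user/spark/share/

    # reference it at submit time (or set it once in spark-defaults.conf)
    spark-submit \
      --master yarn \
      --conf spark.yarn.archive=hdfs:///user/spark/share/spark-libs.jar \
      --class com.example.App app.jar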

Re: Path of jars added to a Spark Job - spark-submit // // Override jars in spark submit

2020-11-12 Thread Russell Spitzer
--driver-class-path does not move jars, so it is dependent on your Spark resource manager (master). It is interpreted literally, so if your files do not exist in the location you provide, relative to where the driver is run, they will not be placed on the classpath. Since the driver is responsible for
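To make the distinction concrete, a hedged sketch (paths, class name, and application jar are placeholders): --jars ships the listed jar to the cluster for the executors, while --driver-class-path is taken verbatim on the machine where the driver starts.

    spark-submit \
      --master yarn \
      --deploy-mode client \
      --jars /local/path/mylib.jar \
      --driver-class-path /local/path/mylib.jar \
      --class com.example.App app.jar

    # --jars: mylib.jar is uploaded and added to the executor classpaths.
    # --driver-class-path: the same string is used as-is on the driver host;
    # if no file exists at that path there, the JVM silently skips the entry.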

Path of jars added to a Spark Job - spark-submit // // Override jars in spark submit

2020-11-12 Thread Dominique De Vito
Hi, I am using Spark 2.1 (BTW) on YARN. I am trying to upload JARs to the YARN cluster, and to use them to replace the on-site (already in-place) JARs. I am trying to do so through spark-submit. One helpful answer
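One common way to attempt this kind of override (a sketch only, not necessarily what the thread settled on; the jar path, class name, and application jar are placeholders) is to ship the replacement jar with --jars and ask Spark to prefer user jars over its bundled ones via the experimental userClassPathFirst settings:

    spark-submit \
      --master yarn \
      --jars hdfs:///user/me/lib/replacement-lib.jar \
      --conf spark.driver.userClassPathFirst=true \
      --conf spark.executor.userClassPathFirst=true \
      --class com.example.App app.jar

    # userClassPathFirst gives user-added jars precedence over Spark's own
    # classes; it is marked experimental and can surface version conflicts,
    # so it needs careful testing.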

Re: Override jars in spark submit

2019-06-19 Thread Keith Chapman
…extraClassPath: the jar file needs to be present on all the executors.

Regards,
Keith.
http://keith-chapman.com

On Wed, Jun 19, 2019 at 8:57 PM naresh Goud wrote:
> Hello All,
>
> How can we override jars in spark submit?
> We have hive-exec-spark jar which is available as part of default spark cluster jars.
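A hedged illustration of the extraClassPath route Keith refers to (the local path, class name, and application jar are placeholders): extraClassPath entries are prepended to the JVM classpath, which is what lets them shadow a cluster-provided jar, but nothing is shipped, so the file must already sit at the same path on every node.

    spark-submit \
      --master yarn \
      --conf spark.driver.extraClassPath=/opt/libs/hive-exec-latest.jar \
      --conf spark.executor.extraClassPath=/opt/libs/hive-exec-latest.jar \
      --class com.example.App app.jar

    # Unlike --jars, this does not distribute the file: hive-exec-latest.jar
    # must be pre-installed at that path on the driver host and on every
    # executor node.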

Override jars in spark submit

2019-06-19 Thread naresh Goud
Hello All,

How can we override jars in spark submit? We have the hive-exec-spark jar, which is available as part of the default Spark cluster jars. We want to override the above-mentioned jar in spark-submit with the latest version of the jar. How do we do that?

Thank you,
Naresh

--
Thanks,
Naresh
www.linkedin.com