Re: Spark LOCAL mode and external jar (extraClassPath)

2018-04-14 Thread Jason Boorn
Ok great, I’ll give that a shot - thanks for all the help.

> On Apr 14, 2018, at 12:08 PM, Gene Pang <gene.p...@gmail.com> wrote:
>
> Yes, I think that is the case. I haven't tried that before, but it should
> work.
>
> Thanks,
> Gene
>
> On Fri, Apr 13, 2

Re: Spark LOCAL mode and external jar (extraClassPath)

2018-04-13 Thread Jason Boorn
> Both spark-submit and spark-shell have command-line options to set the
> classpath for the JVM that is being started.
>
> If you are not using spark-submit or spark-shell, you will have to figure out
> how to configure that JVM instance with the proper properties.
>
> Thanks,
> Gene
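For reference, the command-line options being referred to look roughly like the following; the jar path, class name, and application jar are placeholders, not taken from the thread:

    # --jars adds the listed jar(s) to both the driver and executor classpaths.
    spark-shell --jars /path/to/external-lib.jar

    # spark-submit equivalent; --driver-class-path additionally prepends the jar
    # to the driver JVM's classpath before it starts.
    spark-submit \
      --jars /path/to/external-lib.jar \
      --driver-class-path /path/to/external-lib.jar \
      --class com.example.MyApp \
      my-app.jar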

Re: Spark LOCAL mode and external jar (extraClassPath)

2018-04-13 Thread Jason Boorn
Ok thanks - I was basing my design on this: https://databricks.com/blog/2016/08/15/how-to-use-sparksession-in-apache-spark-2-0.html Wherein it says: Once the SparkSession is instantiated, you can
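The pattern that post describes looks roughly like this in Scala; the app name and the runtime property below are only illustrative, and the caveat in the comments is the crux of this thread:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("my-app")          // illustrative name
      .master("local[*]")
      .getOrCreate()

    // Once the SparkSession is instantiated, runtime properties can be changed:
    spark.conf.set("spark.sql.shuffle.partitions", "8")

    // JVM-level settings such as spark.driver.extraClassPath, however, are read
    // when the driver JVM is launched, so changing them on a live session does
    // not alter the classpath.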

Re: Spark LOCAL mode and external jar (extraClassPath)

2018-04-13 Thread Jason Boorn
> that you provide the jar correctly based on its
> location. I have found it tricky in some cases.
> As a debug try, if the jar is not on HDFS, you can copy it there and then
> specify the full path in the extraClassPath property.
>
> Regards,
> Yohann Jardin
>
> Le 4/13/2018
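For concreteness, a sketch of setting the extraClassPath properties explicitly; paths, class names, and jar names are placeholders. The second variant is not the suggestion above but a commonly used alternative for jars that already live on HDFS, since --jars accepts hdfs:// URLs:

    # extraClassPath entries are plain JVM classpath entries and must resolve on
    # the driver and executor machines:
    spark-submit \
      --conf spark.driver.extraClassPath=/opt/libs/external-lib.jar \
      --conf spark.executor.extraClassPath=/opt/libs/external-lib.jar \
      --class com.example.MyApp my-app.jar

    # Alternative for a jar copied to HDFS: --jars accepts hdfs:// URLs and
    # distributes the jar to the driver and executors.
    spark-submit --jars hdfs:///libs/external-lib.jar --class com.example.MyApp my-app.jar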

Re: Spark LOCAL mode and external jar (extraClassPath)

2018-04-13 Thread Jason Boorn
> I have set up some options using `.config("Option", "value")` when creating
> the spark session, and then other runtime options as you describe above with
> `spark.conf.set`. At this point though I've just moved everything out into a
> `spark-submit` script.
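In Scala terms, the contrast described above is roughly the following; the config keys are just examples, not the ones used in the thread:

    import org.apache.spark.sql.SparkSession

    // Options set while building the session:
    val spark = SparkSession.builder()
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .getOrCreate()

    // Runtime options set after the session exists:
    spark.conf.set("spark.sql.caseSensitive", "true")

    # ...and the same settings moved out into a spark-submit script:
    spark-submit \
      --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
      --conf spark.sql.caseSensitive=true \
      --class com.example.MyApp my-app.jar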

Re: Spark LOCAL mode and external jar (extraClassPath)

2018-04-13 Thread Jason Boorn
Hi Geoff - Appreciate the help here - I do understand what you’re saying below. And I am able to get this working when I submit a job to a local cluster. I think part of the issue here is that there’s ambiguity in the terminology. When I say “LOCAL” spark, I mean an instance of spark that is
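Presumably "LOCAL" here means Spark created inside the application itself with a local[*] master, rather than a job handed to spark-submit. A minimal sketch of that situation, with illustrative names:

    import org.apache.spark.sql.SparkSession

    // Spark embedded directly in the application: no spark-submit, no separate
    // driver JVM is launched.
    val spark = SparkSession.builder()
      .appName("embedded-local-app")
      .master("local[*]")
      // Setting spark.driver.extraClassPath here comes too late: this JVM is
      // already running, so its classpath can no longer be extended. The
      // external jar has to be on the application's own classpath already,
      // e.g. as a build dependency or via `java -cp my-app.jar:external-lib.jar ...`.
      .getOrCreate()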