--jars (ADD_JARS) goes through Spark's own class loading: the jars are
registered with the SparkContext and distributed to the executors.
--driver-class-path (SPARK_CLASSPATH) is captured by the startup scripts and
appended to the classpath used to start the JVM running the driver.
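
For example, when the driver needs a class at JVM startup (a JDBC driver,
say), something along these lines passes the same jar both ways; the master
URL, path, and connector version here are only placeholders:

bin/spark-shell --master spark://<master-host>:7077 \
  --driver-class-path /path/to/mysql-connector-java-5.1.34-bin.jar \
  --jars /path/to/mysql-connector-java-5.1.34-bin.jar

--driver-class-path puts the jar on the driver JVM's classpath at launch,
while --jars also makes it available to the executors.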

You can refer to
https://www.concur.com/blog/en-us/connect-tableau-to-sparksql for an example
of how --jars and --driver-class-path are used in the context of including
the MySQL driver and Hive connectivity.
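
Once the shell is up, a quick way to confirm that the driver can actually see
the class is to load it by name from the REPL (the class name below is just
the usual MySQL Connector/J driver, used as an example):

scala> Class.forName("com.mysql.jdbc.Driver")

If that throws a ClassNotFoundException, the jar did not make it onto the
driver's classpath.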

HTH!

On Mon, Oct 20, 2014 at 5:28 PM, Chuang Liu <liuchuan...@gmail.com> wrote:

> Hi:
>
> I am using Spark 1.1, and want to add external jars to spark-shell. I
> dug around and found that others are doing it in two ways.
>
> *Method 1*
>
> bin/spark-shell --jars "<path-to-jars>"  --master ...
>
> *Method 2*
>
> ADD_JARS=<path-to-jars> SPARK_CLASSPATH=<path-to-jars>  bin/spark-shell
> --master ...
>
> What is the difference between these two methods? In my case, method 1
> does not work, while method 2 works.
>
> Thanks.
>
> Chuang
>
