In our separate environments we run it with spark-submit, so I can give
that a try.
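If I follow correctly, the flags you mention would be passed on the
spark-submit command line roughly like this (a sketch only; the class
name and jar below are placeholders for our actual job):

```shell
# Sketch: com.example.MyApp and my-app.jar are placeholders.
# userClassPathFirst tells the driver and executors to prefer classes
# from our jar (Jersey 2.9) over the Jersey 1.9 bundled with Spark 1.4.1.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app.jar
```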

But our build environment runs unit tests differently, and that's where
the error is thrown. It's set up like this:

        helper = new CassandraHelper(settings.getCassandra().get());
        SparkConf sparkConf = getCassSparkConf(helper);
        sparkConf.setMaster("local[*]");  // everything runs in this one JVM
        sparkConf.setAppName("TEST");
        sparkConf.set("spark.driver.allowMultipleContexts", "true");

        sc = new JavaSparkContext(sparkConf);

I suspect this is the situation you were referring to when you said I have a problem?



On 6 October 2015 at 15:40, Marcelo Vanzin <van...@cloudera.com> wrote:

> On Tue, Oct 6, 2015 at 5:57 AM, oggie <gog...@gmail.com> wrote:
> > We have a Java app written with spark 1.3.1. That app also uses Jersey
> 2.9
> > client to make external calls.  We see spark 1.4.1 uses Jersey 1.9.
>
> How is this app deployed? If it's run via spark-submit, you could use
> "spark.{driver,executor}.userClassPathFirst" to make your app use
> jersey 2.9 while letting Spark use the older jersey.
>
> If you're somehow embedding Spark and running everything in the same
> classloader, then you have a problem.
>
> --
> Marcelo
>
