Thanks for pointing this out. It affects 0.7.0 as well.

________________________________
From: Michael Sells <mjse...@gmail.com>
Sent: Wednesday, August 17, 2016 5:46:35 AM
To: users@zeppelin.apache.org
Subject: Re: Issue loading dependency with SPARK_SUBMIT_OPTIONS w/ 0.6.1

Thanks. I'll file a Jira. Looks like custom repos aren't loading from either source, either. I'll try to replicate as well.
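
For what it's worth, the custom repos go through the same variable; roughly something like this (the repository URL and artifact below are placeholders, not our real ones):

--repositories https://repo.example.com/maven --packages com.example:internal-lib_2.11:1.0.0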

Mike

On Tue, Aug 16, 2016 at 5:55 PM Mina Lee 
<mina...@apache.org> wrote:
I could also reproduce it with Spark 2.0.0, but not with Spark 1.6.1.
If you want to use Zeppelin with Spark 2.0, one alternative you can try is the "Dependencies" section of the GUI interpreter menu [1].

[1] http://zeppelin.apache.org/docs/0.6.1/manual/dependencymanagement.html
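
Another option, assuming the %dep interpreter is enabled in your instance, is to load the artifact dynamically in a paragraph that runs before the Spark interpreter starts, roughly:

%dep
z.load("com.databricks:spark-avro_2.11:3.0.0")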

On Wed, Aug 17, 2016 at 1:46 AM Jeff Zhang 
<zjf...@gmail.com> wrote:
I can reproduce it in 0.6.1 and on the master branch; please file a ticket for it.

On Wed, Aug 17, 2016 at 4:09 AM, Michael Sells 
<mjse...@gmail.com> wrote:
I was testing out 0.6.1 with Spark 2.0 and discovered that the way we load dependencies doesn't seem to work with the new update.

We pass new dependencies in via the SPARK_SUBMIT_OPTIONS environment variable, with the following flag:
--packages com.databricks:spark-avro_2.11:3.0.0
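
For reference, this is roughly how we set it in conf/zeppelin-env.sh (a sketch of our setup; the exact file and quoting may differ in yours):

export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-avro_2.11:3.0.0"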

Now when I try to import it with:
import com.databricks.spark.avro._

I get:
<console>:25: error: object databricks is not a member of package com
       import com.databricks.spark.avro._

I checked the logs and there is no error retrieving the package, so it seems to be something with the classpath.

This works in 0.6.0. Any idea if something changed or if we're doing something 
wrong? I tried this with a few internal packages as well and it doesn't work 
with those either.

Thanks,
Mike

--
Best Regards

Jeff Zhang
