Hi Chris,

there are several ways to load dependencies into Zeppelin 0.5.5.
Using %dep is one of them.
If you want to do it by setting the spark.jars.packages property, the proper
way is to edit your SPARK_HOME/conf/spark-defaults.conf and add the line
below. (I assume you have set SPARK_HOME in
ZEPPELIN_HOME/conf/zeppelin-env.sh.)

spark.jars.packages   org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1
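
For reference, the edit above can be scripted as a minimal sketch like this. The SPARK_HOME path below is a placeholder, not your real installation; on EMR it is typically /usr/lib/spark.

```shell
# Sketch of the spark-defaults.conf edit described above.
# SPARK_HOME here is a placeholder; replace it with your actual
# Spark installation directory (e.g. /usr/lib/spark on EMR).
SPARK_HOME="./spark-home-example"
mkdir -p "$SPARK_HOME/conf"
CONF="$SPARK_HOME/conf/spark-defaults.conf"
LINE='spark.jars.packages   org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1'
# Append the property only if it is not already set, so re-running is safe.
grep -q '^spark\.jars\.packages' "$CONF" 2>/dev/null || echo "$LINE" >> "$CONF"
```

After editing the file, restart the Spark interpreter in Zeppelin so the packages are resolved on the next run.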

The reason you can import the avro dependency is that the Spark assembly
already includes the Avro dependencies, not because you added it in the
Zeppelin interpreter settings.

You can add dependencies via the GUI with the latest master branch
(0.6.0-incubating-SNAPSHOT), which is experimental at the moment.
Please let me know if this answers your question.

Regards,
Mina

On Wed, Mar 9, 2016 at 1:41 AM Chris Miller <cmiller11...@gmail.com> wrote:

> Hi,
>
> I have a strange situation going on. I'm running Zeppelin 0.5.5 and Spark
> 1.6.0 (on Amazon EMR). I added this property to the interpreter settings
> (and restarted it):
>
> spark.jars.packages: org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1
>
> The avro dependency loads fine and I'm able to import and use it. However,
> if I try to import something in the joda-convert package (such as,
> org.joda.convert.FromString), I get an error that "error: object convert is
> not a member of package org.joda".
>
> If I run the spark-shell from the CLI and include the same string above in
> the --package parameter, I'm able to import joda-convert just fine. Also,
> if I restart the interpreter and manually import the dependency with
> z.load(), it also works fine:
>
> %dep
> z.load("org.joda:joda-convert:1.8.1")
>
> So, what's going on here?
>
> --
> Chris Miller
>