You can use %spark.conf to add custom jars to the spark interpreter.
%spark.conf is more powerful than %dep: it lets you not only add custom jars but also
set other spark configurations.

e.g.

%spark.conf

spark.jars /tmp/product.jar
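
For instance, a sketch of a %spark.conf paragraph that sets a couple of other properties
alongside the jar (the property names are standard Spark settings; the values are just
illustrations, not recommendations):

%spark.conf

spark.jars /tmp/product.jar
spark.driver.memory 2g
spark.executor.cores 2

Keep in mind that %spark.conf has to run before the spark interpreter is launched
(i.e. before any %spark paragraph in the session); otherwise you need to restart the
interpreter for the settings to take effect.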


See the Generic ConfInterpreter section of this article:
https://medium.com/@zjffdu/zeppelin-0-8-0-new-features-ea53e8810235

Yohana Khoury <yohana.kho...@gigaspaces.com> wrote on Sunday, May 27, 2018 at 10:22 PM:

> Hi,
>
> I am trying to run the following on Zeppelin 0.8.0-RC2, Spark 2.3:
>
> %dep
> z.load("/tmp/product.jar")
>
> and then
>
> %spark
> import model.v1._
>
> The result is:
>
> <console>:25: error: not found: value model
>        import model.v1._
>               ^
>
>
> It appears that Zeppelin does not load the jar file into the spark
> interpreter.
> Are you dropping this option from Zeppelin 0.8.0? Is it going to be fixed?
>
> It worked with Zeppelin 0.7.2.
>
>
> Thanks!
>
