Hi Niek,

I tried to dig into the problem, but I couldn't reproduce it with the
following steps:
1. Add the dependency '/my/path/a.jar' in the GUI
2. Update the contents of '/my/path/a.jar'
3. Go to the interpreter page and click Edit -> Save

It would be helpful if you could point out what I missed to reproduce the
issue.

FYI, the Spark interpreter process adds the jars under
local-repo/${interpreter_id} to its classpath on startup, so if you make
changes under this directory you will need to restart the Spark interpreter.

On Sat, Sep 3, 2016 at 11:35 AM tolomaus <niek.bartholom...@gmail.com>
wrote:

> Hi,
> I just upgraded from zeppelin 0.5.6/spark 1.6.2 to zeppelin 0.6.1/spark
> 2.0.0, and after moving my application's jars from %deps to the spark
> interpreter UI dependencies section I noticed that updated jars are not
> taken into account anymore. Instead, Zeppelin continues to load the
> original versions of my jars.
> I see that zeppelin initially moves the jars to
> [zeppelin]/local-repo/2BXAQ6T44 and sends this location to spark. But even
> when I manually remove them there it still doesn't trigger a refresh.
> My temporary workaround is to have my CI system put the jars also to
> [zeppelin]/local-repo/2BXAQ6T44.
> I'm using a dedicated spark install in local mode.
> Regards,
> Niek.
> --
> View this message in context:
> http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Dependency-jars-not-refreshed-after-interpreter-restart-in-Zeppelin-0-6-1-tp4035.html
> Sent from the Apache Zeppelin Users (incubating) mailing list archive at
> Nabble.com.
