I only change the content of the jar, not its name or version (otherwise I’d
have to re-add it as a dependency anyway). Or do you mean something else by
‘version’?
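
To be concrete about what ‘version’ means here: in sbt the version ends up in
the artifact’s file name, so bumping it would force re-adding the dependency
anyway. A minimal build.sbt sketch (project name and versions are illustrative,
not from this thread):

    // build.sbt - minimal sketch; all names here are made up
    name := "my-spark-udfs"
    organization := "com.example"
    version := "0.1.0"       // bumping this changes the jar's file name:
    scalaVersion := "2.11.8" // sbt package -> target/scala-2.11/my-spark-udfs_2.11-0.1.0.jar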
This dependency is a local file. Zeppelin and Spark are both running on the
same machine. So I’m just specifying the file system path of the jar; it’s not
even prefixed with file:///.
From: Jhon Anderson Cardenas Diaz [mailto:jhonderson2...@gmail.com]
Sent: 22 February 2018 12:18
Subject: EXT: Re: Jar dependencies are not reloaded when Spark interpreter is restarted
When you say you change the dependency, is it only the content, or the content
and the version? I think the dependency should be reloaded only if its version
changes. I do not think it's optimal to re-download the dependencies every time
the interpreter is restarted.
On 22 Feb 2018 05:22, "Partridge, Lucas (GE Aviation)" wrote:
I’m using Zeppelin 0.7.3 against a local standalone Spark ‘cluster’. I’ve added
a Scala jar dependency to my Spark interpreter using Zeppelin’s UI. I thought
that if I changed my Scala code and updated the jar (using sbt outside of
Zeppelin), all I’d have to do is restart the interpreter for the new code to be
picked up in a regular Scala paragraph. However, restarting the interpreter
appears to have no effect – the new code is not detected. Is that expected
behaviour or a bug?
The workaround I’m using at the moment is to edit the Spark interpreter, remove
the jar, re-add it, save the changes, and then restart the interpreter. Clumsy,
but that’s better than restarting Zeppelin altogether.
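
If that remove/re-add/restart dance has to happen often, it can at least be
scripted against Zeppelin’s interpreter REST API. A minimal Scala sketch,
assuming an unauthenticated Zeppelin on localhost:8080 and a hypothetical
setting id (the real id comes from GET /api/interpreter/setting):

    import java.net.{HttpURLConnection, URL}
    import scala.io.Source

    object RestartSparkInterpreter {
      // Assumptions, not from this thread: Zeppelin on localhost:8080,
      // no auth, and a made-up setting id - look yours up first via
      // GET /api/interpreter/setting.
      val zeppelin  = "http://localhost:8080"
      val settingId = "2C4U48MY3_spark" // hypothetical

      def main(args: Array[String]): Unit = {
        // Zeppelin's documented restart endpoint:
        // PUT /api/interpreter/setting/restart/{settingId}
        val conn = new URL(s"$zeppelin/api/interpreter/setting/restart/$settingId")
          .openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("PUT")
        println(Source.fromInputStream(conn.getInputStream).mkString)
        conn.disconnect()
      }
    }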
Also, if anyone knows of a better way to reload code without restarting the
interpreter then I’m open to suggestions :). Having to re-run lots of
paragraphs after a restart is pretty tedious.
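
For the re-running part at least, Zeppelin’s notebook REST API can run every
paragraph of a note in one call. A sketch under the same assumptions as above,
with a hypothetical note id (the id that appears in the notebook’s URL):

    import java.net.{HttpURLConnection, URL}
    import scala.io.Source

    object RunAllParagraphs {
      def main(args: Array[String]): Unit = {
        val noteId = "2A94M5J1Z" // hypothetical - taken from the notebook URL
        // Documented endpoint: POST /api/notebook/job/{noteId}
        // runs all paragraphs in the note.
        val conn = new URL(s"http://localhost:8080/api/notebook/job/$noteId")
          .openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("POST")
        conn.setDoOutput(true)
        conn.getOutputStream.close() // empty request body
        println(Source.fromInputStream(conn.getInputStream).mkString)
        conn.disconnect()
      }
    }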