Number 2 under http://zeppelin.apache.org/docs/0.8.2/interpreter/spark.html is
the best guide. spark.jars.packages can be set on the interpreter. I had
to add export SPARK_SUBMIT_OPTIONS="--repositories " to
zeppelin-env.sh to add my repo to the mix
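In case it helps anyone searching the archive, a sketch of that configuration (the repository URL and Maven coordinate are the ones from Anton's example below; substitute your own):

```shell
# conf/zeppelin-env.sh: pass an extra Maven repository to spark-submit
# (repository URL taken from Anton's example; substitute your own)
export SPARK_SUBMIT_OPTIONS="--repositories https://dl.bintray.com/comp-bio-aging/main"
```

and then, in the Spark interpreter settings, set the property spark.jars.packages to group.research.aging:spark-extensions_2.11:0.0.7.2.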
On Fri, Nov 8, 2019 at 5:11 AM Anton Kulaga wrote:
Are there clear instructions on how to use the spark.jars.packages property?
For instance, if I want to depend on the bintray repo
https://dl.bintray.com/comp-bio-aging/main with
"group.research.aging:spark-extensions_2.11:0.0.7.2" as a dependency, what
should I do with the new interpreter?
On 2019/10/12
Glad to hear that.
Mark Bidewell wrote on Sat, Oct 12, 2019 at 1:30 AM:
Just wanted to say "thanks"! Using spark.jars.packages, etc worked great!
On Fri, Oct 11, 2019 at 9:45 AM Jeff Zhang wrote:
> That's right, the document should also be updated
>
> Mark Bidewell wrote on Fri, Oct 11, 2019 at 9:28 PM:
>
>> Also the interpreter setting UI is still listed as the first way to
>>
It looks like many users are still used to specifying spark dependencies in
the interpreter setting UI; spark.jars and spark.jars.packages seem too
difficult to understand and not transparent. So I created ticket
https://issues.apache.org/jira/browse/ZEPPELIN-4374 so that users can still
set dependencies in the interpreter setting UI.
Like I said above, try to set them via spark.jars and spark.jars.packages.
Don't set them here
[image: image.png]
Mark Bidewell wrote on Fri, Oct 11, 2019 at 9:35 AM:
I was specifying them in the interpreter settings in the UI.
On Thu, Oct 10, 2019 at 9:30 PM Jeff Zhang wrote:
How do you specify your spark interpreter dependencies? You need to
specify them via the property spark.jars or spark.jars.packages for
non-local mode.
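For the archive, a sketch of how the two properties differ (both example values are placeholders; the Maven coordinate is borrowed from elsewhere in this thread):

```properties
# spark.jars: comma-separated jar files (local paths or URLs)
# shipped to the driver and executors as-is
spark.jars=/path/to/my-extension.jar

# spark.jars.packages: comma-separated Maven coordinates
# (group:artifact:version), resolved from the configured repositories at launch
spark.jars.packages=group.research.aging:spark-extensions_2.11:0.0.7.2
```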
Mark Bidewell wrote on Fri, Oct 11, 2019 at 3:45 AM:
I am running some initial tests of Zeppelin 0.8.2 and I am seeing some
weird issues with dependencies. When I use the old interpreter, everything
works as expected. When I use the new interpreter, classes in my
interpreter dependencies cannot be resolved when connecting to a master
that is not