This plan of having a script that can install individual interpreters on
demand sounds great. It also seems reasonable to always ship with a Spark
version for now, to avoid the complexity of installing that separately.
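
For reference, a sketch of what such a script could look like in practice. This follows the shape of the interpreter-installer that later Zeppelin releases ship as `bin/install-interpreter.sh`; the exact flags and interpreter names here are assumptions, so check the docs for your release:

```sh
# Install a single interpreter by name (name is an assumption; adjust to your release)
./bin/install-interpreter.sh --name jdbc

# Or install several at once
./bin/install-interpreter.sh --name md,shell,jdbc,python
```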
The original list of minimal interpreters seems good, too. The generic JDBC
interpreter handles a lot […]
> I believe z.context() is the only way to share data between interpreters.
>> Within an interpreter, data is usually available across paragraphs…perhaps
>> even across notebooks, as I guess Zeppelin will create a single interpreter
>> in the backend unless you somehow ma […]
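
To make the quoted point concrete, here is a sketch of cross-interpreter sharing via ZeppelinContext's resource pool (`z.put`/`z.get`). This is a notebook fragment, not standalone code, and the variable names are illustrative; it assumes both paragraphs run in a Zeppelin note where `z` is bound:

```
%spark
// Scala paragraph: publish a value into Zeppelin's shared resource pool
val counts = Seq(("a", 1), ("b", 2))
z.put("counts", counts)
```

```
%python
# Python paragraph in the same note: read the value back by name
counts = z.get("counts")
print(counts)
```

Within a single interpreter, plain variables already persist across paragraphs; the resource pool is what lets distinct interpreter processes exchange objects.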
Greetings,
I'm brand new to Zeppelin, and this notebook technology looks great. I'm
evaluating it for our data science team and got it up and running
quickly using some PostgreSQL data and some Spark tests. The distributed
nature of each paragraph, and naturally varying interpreters within […]