Hi,

Once you create another Spark interpreter in the Interpreter menu of the
GUI, each notebook should be able to select and use it (via the settings
icon in the top right corner of each notebook).
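
For example, if you name the new interpreter setting spark_local (just an
example name), a paragraph in a notebook that has it bound can call it
explicitly by prefix, roughly like this (the exact prefix syntax can depend
on your Zeppelin version):

  %spark_local
  // this paragraph runs on the SparkContext created by the spark_local
  // interpreter setting, separate from the default %spark one
  sc.parallelize(1 to 100).count()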

If it does not work, could you check the log file for error messages
(typically under the logs/ directory of your Zeppelin installation)?

Thanks,
moon

On Fri, Feb 5, 2016 at 11:54 AM Zhong Wang <wangzhong....@gmail.com> wrote:

> Hi Zeppelin pilots,
>
> I am trying to run multiple Spark interpreters in the same Zeppelin
> instance. This is very helpful when the data comes from multiple Spark
> clusters.
>
> Another useful use case is running one instance in cluster mode and
> another in local mode. This would significantly boost the performance of
> small data analysis.
>
> Is there any way to run multiple Spark interpreters? I tried to create
> another Spark interpreter with a different identifier, which is allowed in
> the UI, but it doesn't work (shall I file a ticket?)
>
> I am now trying to run multiple SparkContexts in the same Spark
> interpreter.
>
> Zhong
