Hi Patrick,
You can actually set SPARK_HOME in your interpreter tab. However, I cannot
test this feature well, and I'll submit a PR for setting environment
variables as well as properties from the interpreter tab. I think you want
to use different versions of Spark with multiple Spark settings. Can you
please file an issue for this?
An easier solution: create different instances of the Spark interpreter, one
per use case:
1) For embedded Spark, just set the master property to local[*].
2) For system-provided Spark, edit that Spark interpreter's settings and
change the master to your cluster URL, e.g. some spark://:7077 address.
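To illustrate, here is a rough sketch of what the two interpreter settings could look like (the interpreter names and the master host are hypothetical placeholders, not values from this thread):

```
# Interpreter instance "spark" — embedded Spark
master = local[*]

# Interpreter instance "spark_cluster" — system-provided Spark
master = spark://your-master-host:7077
```

With both instances defined, each notebook can then bind whichever Spark interpreter instance it needs from the notebook's interpreter binding menu.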
On Mon, Aug 8, 2016 at 9:52 AM,
Hello Zeppelin users,
I was looking to configure Zeppelin so that it uses embedded Spark for some
notebooks but uses system provided Spark for others.
However, it seems that SPARK_HOME is a global parameter in zeppelin-env.sh.
Is it possible to override this setting at the notebook level?