Hi Polina,

You can just define SPARK_HOME in conf/zeppelin-env.sh and remove any other 
Spark configuration from that file; otherwise Zeppelin will simply overwrite 
it. Once this is done, you can define the Spark default configurations in 
Spark's own config file, conf/spark-defaults.conf.
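For example, a minimal sketch of the two files (the SPARK_HOME path and the property values below are illustrative, not taken from this thread):

```shell
# conf/zeppelin-env.sh -- only point Zeppelin at the Spark install;
# leave all other Spark settings out of this file.
export SPARK_HOME=/opt/spark    # example path; adjust to your install

# $SPARK_HOME/conf/spark-defaults.conf -- project-wide Spark defaults
# (shown here as comments; these lines go in that file, not in zeppelin-env.sh)
#
#   spark.master            yarn
#   spark.driver.memory     4g
#   spark.executor.memory   2g
```

With this split, Zeppelin picks up SPARK_HOME at startup and the Spark interpreter inherits whatever defaults spark-defaults.conf declares.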

Cheers,
Ndjido.

> On 08 Aug 2016, at 07:31, Polina Marasanova 
> <polina.marasan...@quantium.com.au> wrote:
> 
> Hi everyone,
> 
> I have a question: in previous versions of Zeppelin, all interpreter 
> settings were stored in a file called "interpreter.json". It was very 
> convenient to provide some default Spark settings there, such as 
> spark.master, default driver memory, etc.
> What is the best way for version 0.6.0 to provide a bunch of defaults 
> specific to a project?
> 
> Thanks
> Cheers
> Polina Marasanova
