Based on my testing, that *isn't* the case: settings in interpreter.json were
not overridden by env vars.

Let me double-check, write up a recipe, and get back to this.

That said, what would *you* say is a good way to do what I want?
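
For reference, the precedence Felix points to below (SparkInterpreter.java#L354)
reads to me roughly like the sketch that follows. The class, method, and
property names here are simplified placeholders I made up for illustration, not
the exact Zeppelin code:

    // Rough sketch of the env-var-over-property precedence being discussed.
    // Names are illustrative; see SparkInterpreter.java#L354 for the real logic.
    public class ConfigPrecedenceSketch {

        // Return the env var if set, otherwise the property, otherwise a default.
        static String getSystemDefault(String envName, String propertyName, String defaultValue) {
            if (envName != null && !envName.isEmpty()) {
                String envValue = System.getenv(envName);
                if (envValue != null) {
                    return envValue;  // env var wins; the Interpreter-page setting is ignored
                }
            }
            if (propertyName != null && !propertyName.isEmpty()) {
                String propValue = System.getProperty(propertyName);
                if (propValue != null) {
                    return propValue;  // fall back to the configured property
                }
            }
            return defaultValue;
        }

        public static void main(String[] args) {
            // e.g. an env var like MASTER from zeppelin-env.sh vs. a spark.master property
            System.out.println(getSystemDefault("MASTER", "spark.master", "local[*]"));
        }
    }

If that reading is right it matches Felix's description, which is the opposite
of what I saw with interpreter.json, hence the double-check above.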

> On Nov 8, 2015, at 19:13, Felix Cheung <felixcheun...@hotmail.com> wrote:
> 
> Thank you for your detailed and thoughtful mail.
> 
> As you can see here:
> https://github.com/apache/incubator-zeppelin/blob/master/spark/src/main/java/org/apache/zeppelin/spark/SparkInterpreter.java#L354
> 
> The env var overrides the property setting, so if the env var is set in 
> zeppelin-env.sh, then a user setting the property on the Interpreter page 
> would have no effect. That behavior seems to be the opposite of what you want?
> 
> _____________________________
> From: Craig Ching <craigch...@gmail.com>
> Sent: Sunday, November 8, 2015 3:10 PM
> Subject: Setting defaults and allowing users to override the defaults (was 
> Re: Setting spark.app.name?)
> To: <users@zeppelin.incubator.apache.org>
> 
> 
> Hey all, 
> 
> I sent my PR that allows you to configure spark.app.name, but it was 
> rejected.  So I’m going to start a conversation on how to solve my problem in 
> a general way. 
> 
> I am running a service that allows users to start zeppelin instances so they 
> can create their own notebooks.  I want to use this to expose spark (and 
> consequently zeppelin) to a wider audience in my organization.  I have 
> created a separate web UI that allows users to “spin up” their own zeppelin 
> instance and I have some tutorials they can work through that exercise 
> different parts of spark so they can learn about it.  I run a separate spark 
> cluster because I imagine I might get quite a few users, and running, say, 
> 100 zeppelin+spark instances on a single machine just doesn’t sound like a 
> great idea. 
> 
> As a zeppelin site administrator, I want to be able to specify default 
> settings for new users, but I want to allow users to change those settings as 
> they become more aware of spark and zeppelin.  For instance, they may want to 
> run their own spark cluster and run against that. 
> 
> So initially I want to be able to set spark master to my cluster and I want 
> to be able to specify unique spark app names so that the applications are 
> identifiable in the cluster.  But I want users to be able to change those 
> settings later as they become more comfortable with spark and zeppelin. 
> 
> Just some quick comments: I don’t yet understand the need for both 
> zeppelin-env.sh and zeppelin-site.xml.  I would actually rather have defaults 
> specified in zeppelin-site.xml, as I’m not a huge fan of having a ton of 
> settings via environment variables.  I don’t fully understand the current use 
> of zeppelin-site.xml, but could it be used as “first run” defaults like I 
> need for my use case? 
> 
> So, any thoughts? 
> 
> Cheers, 
> Craig
> 
> 
