[ https://issues.apache.org/jira/browse/SPARK-16194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15348645#comment-15348645 ]
Michael Gummelt commented on SPARK-16194:
-----------------------------------------

> Env variables are pretty much from outside Spark, right?

They're my own env vars, yeah. The motivating case is setting "SSL_ENABLED" on the driver to enable Mesos SSL support.

> Generally, these are being removed and deprecated anyway.

You mean the Spark env vars like SPARK_SUBMIT_OPTS? That's good to hear, but that's not what I'm talking about.

> Any chance of just using a sys property or command line alternative?

libmesos ultimately needs SSL_ENABLED, so every Spark job I submit would have to convert from the sys property to the env var, which is infeasible. I realize this may be a corner case, but it would bring us to consistency with {{spark.executorEnv.[ENV]}}.

> No way to dynamically set env vars on driver in cluster mode
> ------------------------------------------------------------
>
>                 Key: SPARK-16194
>                 URL: https://issues.apache.org/jira/browse/SPARK-16194
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 2.0.0
>            Reporter: Michael Gummelt
>            Priority: Minor
>
> I often need to dynamically configure a driver when submitting in cluster mode, but there's currently no way of setting env vars. {{spark-env.sh}} lets me set env vars, but I have to statically build that into my Spark distribution. I need a solution for specifying them in {{spark-submit}}. Much like {{spark.executorEnv.[ENV]}}, but for drivers.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
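For illustration, a driver-side analogue of {{spark.executorEnv.[ENV]}} might be passed at submit time like the sketch below. The property name {{spark.driverEnv.[ENV]}} is hypothetical, shown only to make the requested shape concrete; it is not an existing Spark configuration key:

{code:bash}
# Hypothetical sketch: spark.driverEnv.* mirrors the existing
# spark.executorEnv.* pattern, but targets the driver process.
# spark.driverEnv.* is illustrative only, not a real Spark property.
spark-submit \
  --master mesos://master:7077 \
  --deploy-mode cluster \
  --conf spark.driverEnv.SSL_ENABLED=true \
  --conf spark.executorEnv.SSL_ENABLED=true \
  my_job.py
{code}

Under this sketch, the dispatcher would export SSL_ENABLED=true into the driver's environment before it starts, so libmesos would see it at process startup, without baking anything into {{spark-env.sh}} or the Spark distribution.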