Albert Chu created SPARK-2116:
---------------------------------

             Summary: Load spark-defaults.conf from directory specified by 
SPARK_CONF_DIR
                 Key: SPARK-2116
                 URL: https://issues.apache.org/jira/browse/SPARK-2116
             Project: Spark
          Issue Type: Improvement
          Components: Deploy
    Affects Versions: 1.0.0
            Reporter: Albert Chu
            Priority: Minor
         Attachments: SPARK-2116.patch

Presently, spark-defaults.conf is loaded from 
SPARK_HOME/conf/spark-defaults.conf.  As far as I can tell, the only way to 
specify an alternate file is to pass one on the command line via spark-submit.

It would be convenient to have an environment variable that points at a 
standing alternate spark-defaults.conf.  Loading it from the directory given by 
SPARK_CONF_DIR would be a natural fit, analogous to HADOOP_CONF_DIR in Hadoop.
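
A minimal shell sketch of the lookup order this improvement proposes: prefer 
SPARK_CONF_DIR when it is set, otherwise fall back to SPARK_HOME/conf.  The 
paths and the property below are illustrative only; the attached patch defines 
the actual behavior inside Spark.

```shell
# Illustrative paths -- not real Spark installs.
SPARK_HOME=/tmp/spark-home-demo
SPARK_CONF_DIR=/tmp/spark-conf-demo
mkdir -p "$SPARK_HOME/conf" "$SPARK_CONF_DIR"

# Put a sample property in the SPARK_CONF_DIR copy.
echo "spark.master local[2]" > "$SPARK_CONF_DIR/spark-defaults.conf"

# Proposed resolution: SPARK_CONF_DIR if set, else SPARK_HOME/conf.
DEFAULTS="${SPARK_CONF_DIR:-$SPARK_HOME/conf}/spark-defaults.conf"
echo "loading defaults from: $DEFAULTS"
cat "$DEFAULTS"
```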

A patch is attached; a GitHub pull request will also be sent.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
