[ https://issues.apache.org/jira/browse/SPARK-2098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14059700#comment-14059700 ]
Guoqiang Li commented on SPARK-2098:
------------------------------------
PR: https://github.com/apache/spark/pull/1256
> All Spark processes should support spark-defaults.conf, config file
> -------------------------------------------------------------------
>
> Key: SPARK-2098
> URL: https://issues.apache.org/jira/browse/SPARK-2098
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 1.0.0
> Reporter: Marcelo Vanzin
> Assignee: Guoqiang Li
>
> SparkSubmit supports reading SparkConf settings from a config file. This is
> handy because a site-wide configuration file can be set up easily, while
> power users can supply their own when needed, or fall back to JVM properties
> or other means of overriding configs.
> It would be nice if all Spark processes (e.g. master / worker / history
> server) also supported something like this. For daemon processes this is
> particularly interesting because it makes it easy to decouple starting the
> daemon (e.g. via an /etc/init.d script packaged by a distribution) from
> configuring that daemon. Right now you have to set environment variables to
> modify the configuration of those daemons, which does not fit that scenario
> well.
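
As a concrete illustration of the existing spark-submit behavior (a generic sketch, not the specific change in the PR above; the host name, class, and jar below are placeholders), spark-defaults.conf holds whitespace-separated key/value pairs, is read from the conf directory by default, and an explicit file can be passed with --properties-file:

    # conf/spark-defaults.conf -- site-wide defaults read by spark-submit
    spark.master              spark://master.example.com:7077
    spark.executor.memory     2g
    spark.serializer          org.apache.spark.serializer.KryoSerializer

    # power users can point spark-submit at their own file instead
    ./bin/spark-submit --properties-file /path/to/my-spark.conf --class MyApp my-app.jar

The daemons (master, worker, history server), by contrast, are currently configured through environment variables in conf/spark-env.sh, e.g. SPARK_DAEMON_JAVA_OPTS or SPARK_MASTER_OPTS carrying -Dspark.x=y flags; the idea here is to let those processes read a properties file like the one sketched above instead.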