[ 
https://issues.apache.org/jira/browse/SPARK-2098?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Andrew Or closed SPARK-2098.
----------------------------
          Resolution: Fixed
       Fix Version/s: 1.2.0
    Target Version/s: 1.2.0

> All Spark processes should support spark-defaults.conf, config file
> -------------------------------------------------------------------
>
>                 Key: SPARK-2098
>                 URL: https://issues.apache.org/jira/browse/SPARK-2098
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Marcelo Vanzin
>            Assignee: Guoqiang Li
>             Fix For: 1.2.0
>
>
> SparkSubmit supports reading SparkConf settings from a config file. This is 
> handy because an administrator can set a site-wide configuration file, while 
> power users can supply their own when needed, or fall back to JVM system 
> properties or other means of overriding configs.
> It would be nice if all Spark processes (e.g. master / worker / history 
> server) supported something similar. For daemon processes this is 
> particularly interesting because it decouples starting the daemon (e.g. via 
> an /etc/init.d script packaged by a distribution) from configuring that 
> daemon. Right now you have to set environment variables to modify the 
> configuration of those daemons, which is not very friendly to that scenario.
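
As an illustration of the mechanism the issue refers to, a site-wide spark-defaults.conf is a plain Java-properties-style file of whitespace-separated key/value pairs. The values below are hypothetical examples, not recommendations:

```
# conf/spark-defaults.conf -- read by SparkSubmit at launch
# (example values; adjust paths and sizes for your site)
spark.master            spark://master.example.com:7077
spark.executor.memory   2g
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs:///spark-logs
```

Settings passed explicitly via --conf or set programmatically on SparkConf take precedence over values in this file; the issue asks that the standalone daemons (master, worker, history server) read the same file rather than relying solely on environment variables.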



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
