[
https://issues.apache.org/jira/browse/SPARK-15747?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15314181#comment-15314181
]
Terry Moschou commented on SPARK-15747:
---------------------------------------
Sorry, when I said source multiple {{spark-defaults.d/*.conf}} property files,
I meant loading them programmatically, not sourcing them as shell scripts.
We use Ansible to install and configure Spark and to maintain our
clusters, applications, etc. It would be handy to logically separate the
installation from the configuration. For instance, our Ansible plays that
deploy Spark history servers, which build on our plays that install Spark,
could simply drop in a {{10-history-server.conf}} file with {{spark.history.*}} properties.
I understand there are a few different ways around this, e.g. having Ansible
mutate spark-defaults.conf, or using a drop-in {{spark-env.d/*.sh}} shell
script (sourced by spark-env.sh) to build SPARK_HISTORY_OPTS.
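As a sketch of that second workaround (the spark-env.d directory, fragment name, and property value are all illustrative, not an existing Spark convention):

```shell
# Hypothetical sketch: a spark-env.sh that sources drop-in shell fragments,
# each of which may extend SPARK_HISTORY_OPTS. A throwaway directory stands
# in for a real SPARK_CONF_DIR.
SPARK_CONF_DIR=$(mktemp -d)
mkdir -p "$SPARK_CONF_DIR/spark-env.d"

# A drop-in fragment, e.g. installed by an Ansible play.
cat > "$SPARK_CONF_DIR/spark-env.d/10-history-server.sh" <<'EOF'
SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS -Dspark.history.fs.logDirectory=hdfs:///spark-logs"
EOF

# The part that would live in spark-env.sh itself: source every fragment.
for f in "$SPARK_CONF_DIR"/spark-env.d/*.sh; do
  [ -r "$f" ] && . "$f"
done

echo "SPARK_HISTORY_OPTS=$SPARK_HISTORY_OPTS"
```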
But I can certainly think of plenty of other use cases where this feature
would be useful.
> Support SPARK_CONF_DIR/spark-defaults.d/*.conf drop-in style config files
> -------------------------------------------------------------------------
>
> Key: SPARK-15747
> URL: https://issues.apache.org/jira/browse/SPARK-15747
> Project: Spark
> Issue Type: New Feature
> Reporter: Terry Moschou
>
> Feature request to automatically source all files in
> {{SPARK_CONF_DIR/spark-defaults.d/*.conf}} along with spark-defaults.conf, so
> as to enable easier maintenance and deployment of spark defaults config.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)