Github user chu11 commented on the pull request:
https://github.com/apache/spark/pull/1059#issuecomment-50791871
    I wouldn't say "multiple" conf directories, but rather an alternate one from the
default. In Hadoop I can put all my config files in a /tmp/foo directory,
set HADOOP_CONF_DIR, and Hadoop will read all of its configuration files from
there instead of from its default location.
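    Roughly, the Hadoop workflow looks like this (a minimal sketch; the
/tmp/foo path and the source config directory are just illustrative):

        # Stage the config files in an alternate directory
        mkdir -p /tmp/foo
        cp $HADOOP_HOME/etc/hadoop/*.xml /tmp/foo/

        # Point Hadoop at the alternate directory
        export HADOOP_CONF_DIR=/tmp/foo

        # Subsequent hadoop commands read their *-site.xml files
        # from /tmp/foo rather than the default conf directory
        hadoop fs -ls /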
spark-env.sh is already searched for in SPARK_CONF_DIR via
load-spark-env.sh, so that isn't a problem. However, spark-defaults.conf is
not searched for in SPARK_CONF_DIR. So I can't put all the config files in one
directory.
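    Concretely, the current behavior looks like this (sketched; /tmp/foo is
illustrative, and passing --properties-file to spark-submit is the existing
workaround):

        export SPARK_CONF_DIR=/tmp/foo

        # load-spark-env.sh sources /tmp/foo/spark-env.sh -- this works.
        # But spark-defaults.conf is still read from $SPARK_HOME/conf, so
        # /tmp/foo/spark-defaults.conf is silently ignored unless it is
        # passed explicitly on every invocation:
        spark-submit --properties-file /tmp/foo/spark-defaults.conf ...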
I hope that clarifies things?