Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1256#discussion_r14668309
--- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
@@ -36,20 +38,27 @@ import scala.collection.mutable.HashMap
 * Note that once a SparkConf object is passed to Spark, it is cloned and can no longer be modified
 * by the user. Spark does not support modifying the configuration at runtime.
 *
- * @param loadDefaults whether to also load values from Java system properties
+ * @param loadDefaults whether to also load values from Java system properties, file and resource
+ * @param fileName load properties from file
--- End diff ---
You need to document above what the order of precedence is for these options. From the code, it's: system properties > file > resource.
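For illustration, here is a minimal sketch of that merge order (not the actual SparkConf code; `loadSettings`, its parameters, and the `spark.` prefix filter are assumptions for this sketch): the classpath resource is loaded first, then the file is overlaid on top of it, then system properties on top of both, so later sources win.

```scala
import java.io.FileInputStream
import java.util.Properties

import scala.collection.JavaConverters._

object PrecedenceSketch {
  /** Merges settings so that later sources override earlier ones:
    * resource < file < system properties. */
  def loadSettings(fileName: Option[String], resource: String): Map[String, String] = {
    val props = new Properties()
    // Lowest precedence: defaults bundled as a classpath resource.
    val res = getClass.getResourceAsStream(resource)
    if (res != null) {
      try props.load(res) finally res.close()
    }
    // Middle precedence: an explicit properties file overrides the resource.
    fileName.foreach { f =>
      val in = new FileInputStream(f)
      try props.load(in) finally in.close()
    }
    var settings: Map[String, String] = props.asScala.toMap
    // Highest precedence: Java system properties (spark.*) override both.
    for ((k, v) <- sys.props if k.startsWith("spark.")) {
      settings += (k -> v)
    }
    settings
  }
}
```

With that ordering, e.g. `loadSettings(Some("my.properties"), "/defaults.properties")` would still let a `-Dspark.master=...` system property override the same key from either file.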