Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1256#discussion_r15597062
--- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
@@ -33,25 +35,32 @@ import scala.collection.mutable.HashMap
* All setter methods in this class support chaining. For example, you can write
* `new SparkConf().setMaster("local").setAppName("My app")`.
*
+ * The order of precedence for options is system properties > file.
+ *
* Note that once a SparkConf object is passed to Spark, it is cloned and can no longer be modified
* by the user. Spark does not support modifying the configuration at runtime.
*
- * @param loadDefaults whether to also load values from Java system properties
+ * @param loadDefaults whether to also load values from Java system properties, file.
+ * @param fileName load properties from file
*/
-class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
+class SparkConf(loadDefaults: Boolean, fileName: Option[String])
+ extends Cloneable with Logging {
import SparkConf._
/** Create a SparkConf that loads defaults from system properties and the classpath */
- def this() = this(true)
+ def this() = this(true, None)
+
+ /** Create a SparkConf
--- End diff --
nit: comment starts on the next line.
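For illustration only, the multi-line scaladoc layout this nit points at would look roughly like the sketch below; the wording and the secondary constructor shown are hypothetical, not taken from the patch:

    /**
     * Create a SparkConf that loads defaults from system properties and,
     * when a file name is given, from that properties file as well.
     */
    def this(fileName: Option[String]) = this(true, fileName)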