Repository: spark
Updated Branches:
  refs/heads/master a3315d7f4 -> 65533c7ec


SPARK-1833 - Have an empty SparkContext constructor.

This is nicer than relying on new SparkContext(new SparkConf()).
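
For illustration only (not part of the commit), a minimal sketch of what the change enables; the object name EmptyConstructorExample is hypothetical, and spark.master / spark.app.name are assumed to be supplied externally, for example by ./bin/spark-submit:

import org.apache.spark.{SparkConf, SparkContext}

object EmptyConstructorExample {
  def main(args: Array[String]): Unit = {
    // Before this change, callers had to pass an explicit SparkConf even
    // when every setting comes from spark.* system properties:
    //   val sc = new SparkContext(new SparkConf())

    // After this change, the no-arg constructor does the same thing:
    val sc = new SparkContext()

    // Trivial job to show the context behaves as usual.
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"sum = $sum")
    sc.stop()
  }
}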

Author: Patrick Wendell <[email protected]>

Closes #774 from pwendell/spark-context and squashes the following commits:

ef9f12f [Patrick Wendell] SPARK-1833 - Have an empty SparkContext constructor.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/65533c7e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/65533c7e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/65533c7e

Branch: refs/heads/master
Commit: 65533c7ec03e7eedf5cd9756822863ab6f034ec9
Parents: a3315d7
Author: Patrick Wendell <[email protected]>
Authored: Wed May 14 12:53:30 2014 -0700
Committer: Patrick Wendell <[email protected]>
Committed: Wed May 14 12:53:30 2014 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkContext.scala | 6 ++++++
 1 file changed, 6 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/65533c7e/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 032b3d7..634c10c 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -67,6 +67,12 @@ class SparkContext(config: SparkConf) extends Logging {
   private[spark] var preferredNodeLocationData: Map[String, Set[SplitInfo]] = Map()
 
   /**
+   * Create a SparkContext that loads settings from system properties (for instance, when
+   * launching with ./bin/spark-submit).
+   */
+  def this() = this(new SparkConf())
+
+  /**
    * :: DeveloperApi ::
    * Alternative constructor for setting preferred locations where Spark will create executors.
    *
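
The no-arg constructor simply delegates to new SparkConf(), which by default loads every JVM system property whose name starts with "spark." into the configuration, so anything ./bin/spark-submit sets is picked up automatically. A rough local sketch of that behavior (setting the properties by hand stands in for what spark-submit would normally do; the object name SystemPropsSketch is made up):

import org.apache.spark.SparkContext

object SystemPropsSketch {
  def main(args: Array[String]): Unit = {
    // In a real deployment, ./bin/spark-submit would set these spark.*
    // system properties before user code runs; here we set them by hand.
    System.setProperty("spark.master", "local[2]")
    System.setProperty("spark.app.name", "empty-ctor-demo")

    // new SparkContext() -> new SparkConf(), which picks up the spark.* props.
    val sc = new SparkContext()
    println("app name = " + sc.getConf.get("spark.app.name"))
    sc.stop()
  }
}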
