Repository: spark
Updated Branches:
  refs/heads/branch-1.0 530bdf7d4 -> 8e13ab2fe


SPARK-1833 - Have an empty SparkContext constructor.

This is nicer than relying on new SparkContext(new SparkConf())
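For illustration, a minimal sketch of how the new zero-argument constructor is used; it assumes the application is launched in an environment where Spark settings (e.g. spark.master, spark.app.name) are already provided as system properties, such as via ./bin/spark-submit:

    import org.apache.spark.{SparkConf, SparkContext}

    // Before this change, an explicit SparkConf was required even when all
    // settings come from system properties:
    //   val sc = new SparkContext(new SparkConf())
    //
    // With the new constructor, the equivalent one-liner suffices, since
    // SparkConf() loads spark.* system properties by default:
    val sc = new SparkContext()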

Author: Patrick Wendell <[email protected]>

Closes #774 from pwendell/spark-context and squashes the following commits:

ef9f12f [Patrick Wendell] SPARK-1833 - Have an empty SparkContext constructor.
(cherry picked from commit 65533c7ec03e7eedf5cd9756822863ab6f034ec9)

Signed-off-by: Patrick Wendell <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8e13ab2f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8e13ab2f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8e13ab2f

Branch: refs/heads/branch-1.0
Commit: 8e13ab2fe25d2fd50ee84a42f0f2d248432c7734
Parents: 530bdf7
Author: Patrick Wendell <[email protected]>
Authored: Wed May 14 12:53:30 2014 -0700
Committer: Patrick Wendell <[email protected]>
Committed: Wed May 14 12:53:42 2014 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkContext.scala | 6 ++++++
 1 file changed, 6 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/8e13ab2f/core/src/main/scala/org/apache/spark/SparkContext.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 032b3d7..634c10c 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -67,6 +67,12 @@ class SparkContext(config: SparkConf) extends Logging {
   private[spark] var preferredNodeLocationData: Map[String, Set[SplitInfo]] = Map()
 
   /**
+   * Create a SparkContext that loads settings from system properties (for instance, when
+   * launching with ./bin/spark-submit).
+  def this() = this(new SparkConf())
+
+  /**
    * :: DeveloperApi ::
   * Alternative constructor for setting preferred locations where Spark will create executors.
    *
