Repository: spark
Updated Branches:
  refs/heads/master db3b4a201 -> 69641066a


[SPARK-15037][HOTFIX] Don't create 2 SparkSessions in constructor

## What changes were proposed in this pull request?

After #12907, `TestSparkSession` creates a `SparkSession` in one of its 
constructors just to get the `SparkContext` out of it. This ends up creating two 
`SparkSession`s from a single call, which is definitely not what we want.
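
For context, here is a minimal sketch of the pattern being removed, reconstructed from the deleted lines in the diff below (the package declaration and imports are assumptions added to make the snippet self-contained):

```scala
package org.apache.spark.sql.test

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

private[sql] class TestSparkSession(sc: SparkContext) extends SparkSession(sc) {
  def this() {
    this {
      // Session #1: a throwaway SparkSession built only to obtain a SparkContext.
      val conf = new SparkConf().set("spark.sql.testkey", "true")
      val spark = SparkSession.builder
        .master("local[2]")
        .appName("test-sql-context")
        .config(conf)
        .getOrCreate()
      spark.sparkContext
    } // Session #2: the primary constructor then wraps the same context again.
  }
}
```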

## How was this patch tested?

Jenkins.

Author: Andrew Or <and...@databricks.com>

Closes #13031 from andrewor14/sql-test.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/69641066
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/69641066
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/69641066

Branch: refs/heads/master
Commit: 69641066ae1d35c33b082451cef636a7f2e646d9
Parents: db3b4a2
Author: Andrew Or <and...@databricks.com>
Authored: Tue May 10 12:07:47 2016 -0700
Committer: Andrew Or <and...@databricks.com>
Committed: Tue May 10 12:07:47 2016 -0700

----------------------------------------------------------------------
 .../org/apache/spark/sql/test/TestSQLContext.scala      | 12 +-----------
 1 file changed, 1 insertion(+), 11 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/69641066/sql/core/src/test/scala/org/apache/spark/sql/test/TestSQLContext.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/test/TestSQLContext.scala b/sql/core/src/test/scala/org/apache/spark/sql/test/TestSQLContext.scala
index 785e345..2f247ca 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/test/TestSQLContext.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/test/TestSQLContext.scala
@@ -31,17 +31,7 @@ private[sql] class TestSparkSession(sc: SparkContext) extends SparkSession(sc) {
   }
 
   def this() {
-    this {
-      val conf = new SparkConf()
-      conf.set("spark.sql.testkey", "true")
-
-      val spark = SparkSession.builder
-        .master("local[2]")
-        .appName("test-sql-context")
-        .config(conf)
-        .getOrCreate()
-      spark.sparkContext
-    }
+    this(new SparkConf)
   }
 
   @transient
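

After the patch, the no-arg constructor simply delegates with a default `SparkConf`, so only one session is ever built. A sketch of the resulting constructor chain, assuming (this is not shown in the hunk) that the surrounding class already has a `SparkConf`-taking constructor which builds the local test `SparkContext`:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

private[sql] class TestSparkSession(sc: SparkContext) extends SparkSession(sc) {

  // Assumed pre-existing auxiliary constructor (outside this hunk): builds the
  // single local test SparkContext from the given SparkConf.
  def this(sparkConf: SparkConf) {
    this(new SparkContext("local[2]", "test-sql-context",
      sparkConf.set("spark.sql.testkey", "true")))
  }

  // Patched no-arg constructor: delegate with a default conf; exactly one
  // SparkContext (and hence one SparkSession) is created per call.
  def this() {
    this(new SparkConf)
  }
}
```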

