Repository: spark
Updated Branches:
  refs/heads/master 0b9b6b7d1 -> 3aa60282c


[SPARK-19355][SQL][FOLLOWUP][TEST] Properly recycle SparkSession when TakeOrderedAndProjectSuite finishes

## What changes were proposed in this pull request?

Previously, in `TakeOrderedAndProjectSuite`, the SparkSession would not get recycled when the test suite finished, because `afterAll()` did not call `super.afterAll()`. This change adds that call.
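
For context, here is a minimal, self-contained sketch of the teardown pattern involved, using illustrative names (`SharedSessionSketch`, `ExampleSuite`) rather than the actual Spark test traits: a base trait creates the shared SparkSession in `beforeAll()` and stops it in `afterAll()`, so any suite that overrides `afterAll()` has to finish with `super.afterAll()` for the session to be recycled.

```scala
// Hedged sketch with assumed names, not the real Spark test hierarchy.
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Base trait that owns the shared SparkSession for a suite.
trait SharedSessionSketch extends BeforeAndAfterAll { self: FunSuite =>
  @transient protected var spark: SparkSession = _

  override protected def beforeAll(): Unit = {
    super.beforeAll()
    spark = SparkSession.builder().master("local[2]").appName("sketch").getOrCreate()
  }

  override protected def afterAll(): Unit = {
    try {
      if (spark != null) {
        spark.stop()  // release the session and its SparkContext
        spark = null
      }
    } finally {
      super.afterAll()
    }
  }
}

class ExampleSuite extends FunSuite with SharedSessionSketch {
  override protected def afterAll(): Unit = {
    // Restore any per-suite configuration changes here, then hand control back
    // to the shared-session trait. Skipping this super call leaks the session;
    // adding it back is what this commit does in TakeOrderedAndProjectSuite.
    super.afterAll()
  }

  test("shared session is usable") {
    assert(spark.range(10).count() == 10L)
  }
}
```

Putting the `spark.stop()` and the `super.afterAll()` call in a `try`/`finally` in the base trait is a common way to make sure the session is recycled even if a suite's own cleanup throws.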

## How was this patch tested?

N/A

Closes #22330 from jiangxb1987/SPARK-19355.

Authored-by: Xingbo Jiang <xingbo.ji...@databricks.com>
Signed-off-by: Xiao Li <gatorsm...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/3aa60282
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/3aa60282
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/3aa60282

Branch: refs/heads/master
Commit: 3aa60282cc84d471ea32ef240ec84e5b6e3e231b
Parents: 0b9b6b7
Author: Xingbo Jiang <xingbo.ji...@databricks.com>
Authored: Tue Sep 4 09:44:42 2018 -0700
Committer: Xiao Li <gatorsm...@gmail.com>
Committed: Tue Sep 4 09:44:42 2018 -0700

----------------------------------------------------------------------
 .../org/apache/spark/sql/execution/TakeOrderedAndProjectSuite.scala | 1 +
 1 file changed, 1 insertion(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/3aa60282/sql/core/src/test/scala/org/apache/spark/sql/execution/TakeOrderedAndProjectSuite.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/TakeOrderedAndProjectSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/TakeOrderedAndProjectSuite.scala
index 0a1c94c..f076959 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/TakeOrderedAndProjectSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/TakeOrderedAndProjectSuite.scala
@@ -45,6 +45,7 @@ class TakeOrderedAndProjectSuite extends SparkPlanTest with SharedSQLContext {
 
   protected override def afterAll() = {
     SQLConf.get.setConf(SQLConf.LIMIT_FLAT_GLOBAL_LIMIT, originalLimitFlatGlobalLimit)
+    super.afterAll()
   }
 
   private def generateRandomInputData(): DataFrame = {

