Repository: spark
Updated Branches:
  refs/heads/branch-2.2 9836ea19f -> b0f30b56a


[SPARK-22120][SQL] TestHiveSparkSession.reset() should clean out Hive warehouse directory

## What changes were proposed in this pull request?
During TestHiveSparkSession.reset(), which is called after each 
TestHiveSingleton suite, we now delete and recreate the Hive warehouse 
directory.

## How was this patch tested?
Ran full suite of tests locally, verified that they pass.

Author: Greg Owen <[email protected]>

Closes #19341 from GregOwen/SPARK-22120.

(cherry picked from commit ce204780ee2434ff6bae50428ae37083835798d3)
Signed-off-by: gatorsmile <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b0f30b56
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/b0f30b56
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/b0f30b56

Branch: refs/heads/branch-2.2
Commit: b0f30b56af0563186516147d9ef296b37f679192
Parents: 9836ea1
Author: Greg Owen <[email protected]>
Authored: Mon Sep 25 14:16:11 2017 -0700
Committer: gatorsmile <[email protected]>
Committed: Mon Sep 25 14:16:25 2017 -0700

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/sql/hive/test/TestHive.scala   | 6 ++++++
 1 file changed, 6 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/b0f30b56/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
index ee9ac21..4612cce 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
@@ -18,6 +18,7 @@
 package org.apache.spark.sql.hive.test
 
 import java.io.File
+import java.net.URI
 import java.util.{Set => JavaSet}
 
 import scala.collection.JavaConverters._
@@ -486,6 +487,11 @@ private[hive] class TestHiveSparkSession(
         }
       }
 
+      // Clean out the Hive warehouse between each suite
+      val warehouseDir = new File(new 
URI(sparkContext.conf.get("spark.sql.warehouse.dir")).getPath)
+      Utils.deleteRecursively(warehouseDir)
+      warehouseDir.mkdir()
+
       sharedState.cacheManager.clearCache()
       loadedTables.clear()
       sessionState.catalog.reset()

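The patch's core logic is: read the configured warehouse location (a `file:` URI), convert it to a filesystem path, delete the tree recursively, then recreate an empty directory so the next suite starts clean. As a rough illustration of that delete-and-recreate pattern (in Python rather than the Scala of the actual patch; the function name `reset_warehouse` is ours, not Spark's):

```python
import os
import shutil
from urllib.parse import urlparse

def reset_warehouse(warehouse_uri: str) -> str:
    """Delete and recreate the warehouse directory named by a file: URI.

    Mirrors the patch's steps: parse the URI, take its filesystem path,
    remove the tree (like Utils.deleteRecursively), recreate it empty.
    """
    path = urlparse(warehouse_uri).path
    shutil.rmtree(path, ignore_errors=True)
    os.mkdir(path)
    return path
```

Note that, as in the Scala code, the URI must be parsed first: passing the raw `file:/...` string to a filesystem call would fail, since the scheme prefix is not part of the path.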
