Repository: spark
Updated Branches:
  refs/heads/master 038b18573 -> ce204780e
[SPARK-22120][SQL] TestHiveSparkSession.reset() should clean out Hive warehouse directory

## What changes were proposed in this pull request?

During TestHiveSparkSession.reset(), which is called after each TestHiveSingleton suite, we now delete and recreate the Hive warehouse directory.

## How was this patch tested?

Ran full suite of tests locally, verified that they pass.

Author: Greg Owen <[email protected]>

Closes #19341 from GregOwen/SPARK-22120.

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/ce204780
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/ce204780
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/ce204780

Branch: refs/heads/master
Commit: ce204780ee2434ff6bae50428ae37083835798d3
Parents: 038b185
Author: Greg Owen <[email protected]>
Authored: Mon Sep 25 14:16:11 2017 -0700
Committer: gatorsmile <[email protected]>
Committed: Mon Sep 25 14:16:11 2017 -0700

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/sql/hive/test/TestHive.scala | 6 ++++++
 1 file changed, 6 insertions(+)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/spark/blob/ce204780/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
index 0f6a81b..b6be00d 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala
@@ -18,6 +18,7 @@ package org.apache.spark.sql.hive.test

 import java.io.File
+import java.net.URI
 import java.util.{Set => JavaSet}

 import scala.collection.JavaConverters._
@@ -498,6 +499,11 @@ private[hive] class TestHiveSparkSession(
     }
   }

+    // Clean out the Hive warehouse between each suite
+    val warehouseDir = new File(new URI(sparkContext.conf.get("spark.sql.warehouse.dir")).getPath)
+    Utils.deleteRecursively(warehouseDir)
+    warehouseDir.mkdir()
+
    sharedState.cacheManager.clearCache()
    loadedTables.clear()
    sessionState.catalog.reset()

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
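For readers outside the Spark codebase, the added `reset()` lines can be sketched as a standalone snippet. This is a minimal sketch, not the actual Spark code: `WarehouseCleanupSketch`, `resetWarehouse`, and the local `deleteRecursively` helper are hypothetical names, with the helper standing in for Spark's internal `Utils.deleteRecursively`. It shows the same pattern the patch uses: resolve the warehouse path from a `file:` URI, delete the directory recursively, then recreate it empty.

```scala
import java.io.File
import java.net.URI

object WarehouseCleanupSketch {
  // Stand-in for Spark's internal Utils.deleteRecursively:
  // remove a file, or a directory and everything under it.
  def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) f.listFiles().foreach(deleteRecursively)
    f.delete()
  }

  // Same pattern as the patch: parse the configured warehouse URI,
  // wipe the directory, and recreate it empty.
  def resetWarehouse(warehouseUri: String): File = {
    val warehouseDir = new File(new URI(warehouseUri).getPath)
    deleteRecursively(warehouseDir)
    warehouseDir.mkdir()
    warehouseDir
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical example: stage a warehouse dir containing a leftover
    // table directory, reset it, and check it comes back empty.
    val tmp = new File(System.getProperty("java.io.tmpdir"), "warehouse-sketch")
    tmp.mkdirs()
    new File(tmp, "leftover_table").mkdir()
    val cleaned = resetWarehouse(tmp.toURI.toString)
    println(cleaned.exists() && cleaned.listFiles().isEmpty) // prints: true
    deleteRecursively(cleaned)
  }
}
```

Recreating the directory after deleting it matters: later suites expect the warehouse path to exist, so leaving it absent would trade stale-table failures for missing-directory failures.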
