Github user rdblue commented on a diff in the pull request:
https://github.com/apache/spark/pull/21122#discussion_r186205362
--- Diff:
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
---
@@ -1354,7 +1354,8 @@ class HiveDDLSuite
val indexName = tabName + "_index"
withTable(tabName) {
       // Spark SQL does not support creating index. Thus, we have to use Hive client.
 -     val client = spark.sharedState.externalCatalog.asInstanceOf[HiveExternalCatalog].client
 +     val client =
 +       spark.sharedState.externalCatalog.unwrapped.asInstanceOf[HiveExternalCatalog].client
--- End diff ---
@gatorsmile, what is the reason for passing the client as a field of
HiveExternalCatalog and not on its own? This requires casting the catalog to
access the client, and there doesn't appear to be an obvious reason not to pass
the client separately.
Given that this causes a problem here -- the proposed interface returns a
wrapped catalog, so the cast only works after unwrapping -- I'm trying to
understand why it is designed this way and whether it should be changed.
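To make the concern concrete, here is a minimal, hypothetical sketch (not
Spark code -- the class names and the `unwrapped` method are simplified
stand-ins for the real Spark types) of why a wrapping catalog forces callers
to unwrap and cast before they can reach the Hive client:

```scala
// Simplified stand-ins for the Spark classes under discussion.
trait ExternalCatalog {
  // Wrappers override this to expose the underlying catalog.
  def unwrapped: ExternalCatalog = this
}

class HiveClient {
  def runSqlHive(sql: String): String = s"ran: $sql"
}

// Current design: the client is a field of HiveExternalCatalog, so callers
// must reach the concrete type to get at it.
class HiveExternalCatalog(val client: HiveClient) extends ExternalCatalog

// A wrapping catalog (e.g. one that fires listener events) hides the
// concrete type, which is why callers need `unwrapped` before the cast.
class CatalogWithListener(delegate: ExternalCatalog) extends ExternalCatalog {
  override def unwrapped: ExternalCatalog = delegate.unwrapped
}

object Demo {
  def main(args: Array[String]): Unit = {
    val catalog: ExternalCatalog =
      new CatalogWithListener(new HiveExternalCatalog(new HiveClient))

    // The pattern the test uses today: unwrap, then cast.
    val client = catalog.unwrapped.asInstanceOf[HiveExternalCatalog].client
    println(client.runSqlHive("CREATE INDEX ..."))
  }
}
```

The alternative raised in the comment would be to expose the client
separately (e.g. as its own field on the shared state) so callers never need
to know the concrete catalog type or perform the cast at all.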
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]