Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/21122#discussion_r186143143
--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
@@ -1354,7 +1354,8 @@ class HiveDDLSuite
     val indexName = tabName + "_index"
     withTable(tabName) {
       // Spark SQL does not support creating index. Thus, we have to use Hive client.
-      val client = spark.sharedState.externalCatalog.asInstanceOf[HiveExternalCatalog].client
+      val client =
+        spark.sharedState.externalCatalog.unwrapped.asInstanceOf[HiveExternalCatalog].client
--- End diff ---
We want to get rid of the Hive dependency completely in the near future. Currently, in the source code, only `HiveExternalCatalog` needs to use/access the `client`.
I might not be getting your point. Could you explain how we would pass the client if we keep it in `HiveExternalCatalog`?
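For readers following along: the reason the test now goes through `.unwrapped` is that the shared external catalog is wrapped in a listener decorator, so casting the wrapper directly to `HiveExternalCatalog` would fail. The following is a minimal, self-contained sketch of that decorator pattern; the class bodies are simplified stand-ins for Spark's `ExternalCatalogWithListener` and `HiveExternalCatalog`, and the `client` field is a placeholder string rather than a real `HiveClient`.

```scala
// Simplified stand-in for Spark's ExternalCatalog interface.
trait ExternalCatalog {
  def tableExists(db: String, table: String): Boolean
}

// Stand-in for HiveExternalCatalog; `client` represents the Hive client handle
// that only this concrete implementation exposes.
class HiveExternalCatalog extends ExternalCatalog {
  val client: String = "hive-client" // placeholder for the real HiveClient
  def tableExists(db: String, table: String): Boolean = true
}

// Stand-in for ExternalCatalogWithListener: delegates all calls and exposes
// the wrapped catalog via `unwrapped`.
class ExternalCatalogWithListener(val unwrapped: ExternalCatalog) extends ExternalCatalog {
  def tableExists(db: String, table: String): Boolean =
    unwrapped.tableExists(db, table)
}

object UnwrapDemo {
  def main(args: Array[String]): Unit = {
    val catalog: ExternalCatalog =
      new ExternalCatalogWithListener(new HiveExternalCatalog)
    // catalog.asInstanceOf[HiveExternalCatalog] would throw ClassCastException,
    // because the runtime type is the wrapper, not HiveExternalCatalog.
    val client = catalog.asInstanceOf[ExternalCatalogWithListener]
      .unwrapped.asInstanceOf[HiveExternalCatalog].client
    println(client) // prints "hive-client"
  }
}
```

This mirrors why the diff above changes the test from casting `externalCatalog` directly to first calling `unwrapped`.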
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]