GitHub user rdblue commented on a diff in the pull request:
https://github.com/apache/spark/pull/21122#discussion_r185138677
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalog.scala ---
@@ -31,10 +30,16 @@ import org.apache.spark.util.ListenerBus
*
* Implementations should throw [[NoSuchDatabaseException]] when databases don't exist.
*/
-abstract class ExternalCatalog
- extends ListenerBus[ExternalCatalogEventListener, ExternalCatalogEvent] {
+trait ExternalCatalog {
import CatalogTypes.TablePartitionSpec
+ // Returns the underlying catalog class (e.g., HiveExternalCatalog).
+ def unwrapped: ExternalCatalog = this
--- End diff ---
Is there a better way to pass the Hive client? It looks like the uses of
`unwrapped` actually just get the Hive client from the HiveExternalCatalog. If
we can pass that through, it would prevent the need for this. I think that
would be cleaner, unless there is a problem with that I'm missing.
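To make the suggestion concrete, here is a minimal, self-contained sketch of the alternative, not Spark's actual code: `HiveClient` below is a stand-in for org.apache.spark.sql.hive.client.HiveClient, and the object, method, and component names are hypothetical.

```scala
// Minimal sketch, assuming the only use of `unwrapped` is to reach the
// Hive client held by HiveExternalCatalog. All names here are illustrative.
object UnwrappedSketch {

  trait HiveClient {
    // Stand-in for whatever Hive operation the caller actually needs.
    def listDatabases(pattern: String): Seq[String]
  }

  trait ExternalCatalog // simplified; the real trait has many more members

  class HiveExternalCatalog(val client: HiveClient) extends ExternalCatalog

  // Pattern the comment points at: callers "unwrap" the catalog only to
  // reach the Hive client behind it.
  def clientViaUnwrap(catalog: ExternalCatalog): HiveClient =
    catalog.asInstanceOf[HiveExternalCatalog].client

  // Suggested alternative: pass the Hive client through directly, so the
  // `unwrapped` hook and the cast are no longer needed.
  class HiveDependentComponent(client: HiveClient) {
    def databases(): Seq[String] = client.listDatabases("*")
  }
}
```

The point is the dependency direction: if `unwrapped` only exists so callers can cast to HiveExternalCatalog and grab its client, handing the client to those callers avoids exposing an unwrapping hook on the public trait.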