peter-toth commented on code in PR #36027:
URL: https://github.com/apache/spark/pull/36027#discussion_r977992658
##########
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##########
@@ -435,6 +439,14 @@ private[hive] class HiveClientImpl(
getRawTableOption(dbName, tableName).map(convertHiveTableToCatalogTable)
}
+ override def getCatalogAndHiveTableOption(
+ dbName: String,
+ tableName: String): Option[CatalogAndHiveTable] = withHiveState {
+ logDebug(s"Looking up $dbName.$tableName")
+ getRawTableOption(dbName, tableName)
+ .map(t => new CatalogAndHiveTableImpl(convertHiveTableToCatalogTable(t), t))
Review Comment:
Ok, I've added this in
https://github.com/apache/spark/pull/36027/commits/c033dc22e3006688b8d1819acaddcfe49ffda125
and made the `toCatalogTable` conversion lazy.
Since the `convertHiveTableToCatalogTable()` converter needs the Hive shim and
uses the `HiveTable` class, I still kept the `RawHiveTable` trait separate from
the `RawHiveTableImpl` class, and made the latter an inner class of
`HiveClientImpl` so it has access to the converter.
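For reference, a minimal standalone sketch of the trait/impl split and the lazy
conversion. In the actual commit `RawHiveTableImpl` is an inner class of
`HiveClientImpl` and calls the private converter directly; here the converter is
passed in as a function parameter (an assumption made only to keep the sketch
self-contained), so the exact signatures may differ from the linked commit:

```scala
import org.apache.hadoop.hive.ql.metadata.{Table => HiveTable}
import org.apache.spark.sql.catalyst.catalog.CatalogTable

// The trait exposes the raw Hive table plus a CatalogTable view of it.
trait RawHiveTable {
  def rawTable: HiveTable
  def toCatalogTable: CatalogTable
}

// Sketch of the implementation: the conversion is a lazy val, so it runs at
// most once and only for callers that actually ask for the CatalogTable.
class RawHiveTableImpl(
    override val rawTable: HiveTable,
    convert: HiveTable => CatalogTable) extends RawHiveTable {
  override lazy val toCatalogTable: CatalogTable = convert(rawTable)
}
```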
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]