GitHub user tdas commented on a diff in the pull request:
https://github.com/apache/spark/pull/16826#discussion_r103064301
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala ---
@@ -212,3 +247,31 @@ private[sql] class HiveSessionCatalog(
"histogram_numeric"
)
}
+
+private[sql] object HiveSessionCatalog {
+
+  def apply(
+      sparkSession: SparkSession,
+      functionResourceLoader: FunctionResourceLoader,
+      functionRegistry: FunctionRegistry,
+      conf: SQLConf,
+      hadoopConf: Configuration,
+      parser: ParserInterface): HiveSessionCatalog = {
+
+    // Catalog for handling data source tables. TODO: This really doesn't belong here since it is
+    // essentially a cache for metastore tables. However, it relies on a lot of session-specific
+    // things so it would be a lot of work to split its functionality between HiveSessionCatalog
+    // and HiveCatalog. We should still do it at some point...
+    val metastoreCatalog = new HiveMetastoreCatalog(sparkSession)
+
+    new HiveSessionCatalog(
+      sparkSession.sharedState.externalCatalog.asInstanceOf[HiveExternalCatalog],
+      sparkSession.sharedState.globalTempViewManager,
+      metastoreCatalog,
+      functionResourceLoader: FunctionResourceLoader,
--- End diff --
I think I mentioned this before: this is an odd way to pass parameters. Type ascription on an argument is only useful when you need to coerce the value to a particular type to satisfy the expected parameter type, and that does not appear to be the case here. Please remove it if it is not necessary.
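
For illustration, here is a minimal, self-contained sketch (hypothetical names, not from this PR) of the one situation where argument-position type ascription actually changes anything, namely steering overload resolution by upcasting the argument:

```scala
// Hypothetical sketch, not from the PR: when argument-position type
// ascription (`expr: Type`) actually matters in Scala.

trait Loader
class JarLoader extends Loader

object AscriptionDemo {
  // Two overloads: ascription can be used to steer resolution between them.
  def register(loader: Loader): String = "generic loader"
  def register(loader: JarLoader): String = "jar loader"

  def main(args: Array[String]): Unit = {
    val jarLoader = new JarLoader

    // Without ascription the most specific overload wins.
    println(register(jarLoader))          // prints "jar loader"

    // Ascribing to Loader upcasts the argument, so the Loader overload is used.
    println(register(jarLoader: Loader))  // prints "generic loader"

    // With a single expected type (as in the constructor call under review),
    // the ascription is a no-op and can simply be dropped.
  }
}
```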