Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/21990#discussion_r224985166
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala ---
@@ -1136,4 +1121,27 @@ object SparkSession extends Logging {
SparkSession.clearDefaultSession()
}
}
+
+  /**
+   * Initialize extensions if the user has defined a configurator class in their SparkConf.
+   * This class will be applied to the extensions passed into this function.
+   */
+  private[sql] def applyExtensionsFromConf(conf: SparkConf, extensions: SparkSessionExtensions) {
+    val extensionConfOption = conf.get(StaticSQLConf.SPARK_SESSION_EXTENSIONS)
--- End diff ---
I think we could pass only `conf.get(StaticSQLConf.SPARK_SESSION_EXTENSIONS)` as the argument instead of the whole `SparkConf`, and rename the method `applyExtensions`.
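To make the suggestion concrete, a rough sketch of what that could look like (the `Option[String]` parameter type and the method body are my assumptions based on the surrounding diff, not the PR's actual code):

```scala
// Sketch inside `object SparkSession extends Logging`; assumes
// `StaticSQLConf.SPARK_SESSION_EXTENSIONS` is an optional string config and that
// `org.apache.spark.util.Utils` is already imported, as elsewhere in this file.
private[sql] def applyExtensions(
    extensionConfClassName: Option[String],
    extensions: SparkSessionExtensions): SparkSessionExtensions = {
  extensionConfClassName.foreach { className =>
    try {
      // Instantiate the configurator class and let it register its extensions.
      val extensionConfClass = Utils.classForName(className)
      val extensionConf = extensionConfClass.getConstructor().newInstance()
        .asInstanceOf[SparkSessionExtensions => Unit]
      extensionConf(extensions)
    } catch {
      // Ignore the configurator if the class is missing or has the wrong type.
      case e @ (_: ClassCastException |
                _: ClassNotFoundException |
                _: NoClassDefFoundError) =>
        logWarning(s"Cannot use $className to configure session extensions.", e)
    }
  }
  extensions
}

// The call site would then look something like:
// applyExtensions(sparkConf.get(StaticSQLConf.SPARK_SESSION_EXTENSIONS), extensions)
```

That way the helper does not depend on `SparkConf` at all and can be reused wherever the extension class name is already available.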
---