cloud-fan commented on a change in pull request #30289:
URL: https://github.com/apache/spark/pull/30289#discussion_r528848795
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala
##########
@@ -361,11 +379,38 @@ object ViewHelper {
}
}
+ /**
+ * Convert the view query SQL configs to `properties`.
+ */
+ private def generateQuerySQLConfigs(conf: SQLConf): Map[String, String] = {
+ val modifiedConfs = conf.getAllConfs.filter { case (k, _) =>
+ conf.isModifiable(k) && !isConfigBlacklisted(k)
+ }
+ val props = new mutable.HashMap[String, String]
+ if (modifiedConfs.nonEmpty) {
+ val confJson = compact(render(JsonProtocol.mapToJson(modifiedConfs)))
+ props.put(VIEW_QUERY_SQL_CONFIGS, confJson)
Review comment:
Have you stress-tested this? The Hive metastore has a limit on property value length;
you can take a look at `HiveExternalCatalog.tableMetaToTableProps`.
Another idea is to put one config per table property entry (see the sketch below).
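To illustrate that second idea, here is a minimal sketch of storing one config per table property,
reusing the `isConfigBlacklisted` check from this patch. `VIEW_QUERY_SQL_CONF_PREFIX`,
`sqlConfigsToProps` and `sqlConfigsFromProps` are hypothetical names, not part of this PR,
and the methods are assumed to live inside `ViewHelper`:
```scala
import org.apache.spark.sql.internal.SQLConf

// Hypothetical key prefix for captured SQL configs; not part of this PR.
val VIEW_QUERY_SQL_CONF_PREFIX = "view.sqlConfig."

// Store each captured SQL config as its own table property, so no single
// property value grows beyond the metastore's length limit.
private def sqlConfigsToProps(conf: SQLConf): Map[String, String] = {
  conf.getAllConfs
    .filter { case (k, _) => conf.isModifiable(k) && !isConfigBlacklisted(k) }
    .map { case (k, v) => (VIEW_QUERY_SQL_CONF_PREFIX + k, v) }
}

// Read the configs back by stripping the prefix from matching properties.
private def sqlConfigsFromProps(props: Map[String, String]): Map[String, String] = {
  props.collect {
    case (k, v) if k.startsWith(VIEW_QUERY_SQL_CONF_PREFIX) =>
      (k.stripPrefix(VIEW_QUERY_SQL_CONF_PREFIX), v)
  }
}
```
Each individual config value is short, so every property stays well under the per-value limit;
the trade-off is a larger number of entries in the table properties map.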
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]