MaxGekk commented on a change in pull request #31577:
URL: https://github.com/apache/spark/pull/31577#discussion_r578166983
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala
##########
@@ -451,4 +451,14 @@ class SQLConfSuite extends QueryTest with SharedSparkSession {
    val e2 = intercept[ParseException](sql("set time zone interval 19 hours"))
    assert(e2.getMessage contains "The interval value must be in the range of [-18, +18] hours")
}
+
+ test("SPARK-34454: configs from the legacy namespace should be internal") {
+ val nonInternalLegacyConfigs = spark.sessionState.conf.getAllDefinedConfs
+ .filter { case (key, _, _, _) => key.contains(".legacy.") }
Review comment:
I want to catch those cases too. Even though we only have `spark.sql.legacy` in OSS for now, imagine a Spark vendor that defines its own configs such as `spark.databricks.legacy.*` ;-) It would be nice if the test covered their configs as well. Also, someone might add a nested `legacy` namespace in the future, and matching on `.legacy.` means the test would already be ready for that.
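
For illustration, here is a sketch of how the complete test could look. The assertion is an assumption added for clarity (the diff above is truncated after the filter), and it relies on `getAllDefinedConfs` returning only non-internal (public) configs, so any remaining entry whose key contains `.legacy.` would be a legacy config that was not marked internal:

    test("SPARK-34454: configs from the legacy namespace should be internal") {
      // getAllDefinedConfs exposes only public configs, so any key containing
      // ".legacy." here is a legacy config that someone forgot to mark internal.
      val nonInternalLegacyConfigs = spark.sessionState.conf.getAllDefinedConfs
        .filter { case (key, _, _, _) => key.contains(".legacy.") }
      // Hypothetical assertion: fail and list the offending keys, if any.
      assert(nonInternalLegacyConfigs.isEmpty,
        s"Non-internal legacy SQL configs: ${nonInternalLegacyConfigs.map(_._1).mkString(", ")}")
    }

With this predicate, keys like `spark.sql.legacy.timeParserPolicy`, a vendor key like `spark.databricks.legacy.foo`, or a nested `spark.sql.something.legacy.bar` would all be caught, whereas matching only on the `spark.sql.legacy.` prefix would miss the last two.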