HyukjinKwon commented on a change in pull request #29559:
URL: https://github.com/apache/spark/pull/29559#discussion_r478460024



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -522,6 +522,15 @@ object SQLConf {
       .checkValue(_ >= 0, "The non-empty partition ratio must be positive number.")
       .createWithDefault(0.2)
 
+  val ADAPTIVE_OPTIMIZER_EXCLUDED_RULES =
+    buildConf("spark.sql.adaptive.optimizer.excludedRules")
+      .doc("Configures a list of rules to be disabled in the adaptive optimizer, in which the " +
+        "rules are specified by their rule names and separated by comma. The optimizer will log " +
+        "the rules that have indeed been excluded.")
+      .version("3.1.0")
+      .stringConf
+      .createOptional

Review comment:
       Should we use `fallbackConf` so that this entry falls back to the value of `spark.sql.optimizer.excludedRules` by default?
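
       For reference, a minimal sketch of what that could look like (my reading of the suggestion, not code from the PR). It assumes the `OPTIMIZER_EXCLUDED_RULES` entry already defined for `spark.sql.optimizer.excludedRules` earlier in SQLConf, plus `ConfigBuilder.fallbackConf`, which ties an entry's type and value to another entry, so the `.stringConf`/`.createOptional` calls are replaced by the fallback call:

```scala
// Sketch only: fall back to spark.sql.optimizer.excludedRules whenever
// the adaptive-specific key is not set explicitly.
val ADAPTIVE_OPTIMIZER_EXCLUDED_RULES =
  buildConf("spark.sql.adaptive.optimizer.excludedRules")
    .doc("Configures a list of rules to be disabled in the adaptive optimizer, in which the " +
      "rules are specified by their rule names and separated by comma. The optimizer will log " +
      "the rules that have indeed been excluded.")
    .version("3.1.0")
    .fallbackConf(OPTIMIZER_EXCLUDED_RULES)
```

       This would match how other adaptive configs, such as `spark.sql.adaptive.advisoryPartitionSizeInBytes`, fall back to a non-adaptive entry when unset.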




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


