dongjoon-hyun commented on a change in pull request #29726:
URL: https://github.com/apache/spark/pull/29726#discussion_r602034890



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -307,6 +307,17 @@ object SQLConf {
       .booleanConf
       .createWithDefault(true)
 
+  val DYNAMIC_PARTITION_PRUNING_PRUNING_SIDE_EXTRA_FILTER_RATIO =
+    buildConf("spark.sql.optimizer.dynamicPartitionPruning.pruningSideExtraFilterRatio")
+    .internal()
+    .doc("When the filtering side doesn't support broadcast by join type, doing DPP means " +
+      "running an extra query that may have significant overhead. This config will be used " +
+      "as the extra filter ratio for computing the data size of the pruning side after DPP, " +
+      "in order to evaluate if it is worth adding an extra subquery as the pruning filter.")
+    .version("3.2.0")
+    .doubleConf

Review comment:
       Can we add `.checkValue` for the expected range, `0.0 <= ratio <= 1.0`?
Or do we intend to allow larger values like `2.0`?
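
If the bound is adopted, the suggestion amounts to appending a `checkValue` to the builder chain in the hunk above. A sketch only: the predicate and the error-message wording are illustrative, not the merged code.

```scala
    .doubleConf
    // Suggested bound from the review: reject values outside [0.0, 1.0].
    // The message text here is hypothetical.
    .checkValue(ratio => ratio >= 0.0 && ratio <= 1.0,
      "The pruning side extra filter ratio must be in the range [0.0, 1.0].")
```

The terminal `createWithDefault(...)` call is not shown in the quoted hunk and is left out of the sketch as well.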




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
