cloud-fan commented on a change in pull request #35451:
URL: https://github.com/apache/spark/pull/35451#discussion_r803303319



##########
File path: 
sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/PartitionPruning.scala
##########
@@ -229,6 +229,24 @@ object PartitionPruning extends Rule[LogicalPlan] with PredicateHelper {
     !plan.isStreaming && hasSelectivePredicate(plan)
   }
 
+  /**
+   * Check whether the partition column already has a static equality filter,
+   * so we can skip adding a dynamic partition pruning filter if one already exists.
+   */
+  private def hasStaticPartitionEqualityCondition(e: Expression, p: LogicalPlan): Boolean = {
+    p.find {

Review comment:
       OK. How about a different approach: we add DPP filters anyway, and then remove the unnecessary ones in `CleanupDynamicPruningFilters`.
   
   In that rule, the predicates have already been pushed down to the source, so we can simply look at the predicates above the scan and remove any DPP filter that has a corresponding static partition filter.
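
   The cleanup pass suggested above could be sketched in a simplified, Spark-free model. Everything here (`PredExpr`, `StaticEq`, `DppFilter`, `cleanupDppFilters`) is a hypothetical stand-in for illustration, not the real Catalyst API:

   ```scala
   // Toy model of predicates sitting above a scan.
   sealed trait PredExpr { def attr: String }
   // A static partition equality such as `p = '2022-01-01'`.
   case class StaticEq(attr: String, value: String) extends PredExpr
   // A dynamic partition pruning filter inserted on attribute `attr`.
   case class DppFilter(attr: String) extends PredExpr

   def cleanupDppFilters(predsAboveScan: Seq[PredExpr]): Seq[PredExpr] = {
     // Attributes already fixed by a static equality condition.
     val staticAttrs = predsAboveScan.collect { case StaticEq(a, _) => a }.toSet
     // Keep a DPP filter only when its attribute is not statically constrained;
     // static predicates themselves are always kept.
     predsAboveScan.filter {
       case DppFilter(a) => !staticAttrs.contains(a)
       case _            => true
     }
   }
   ```

   With this sketch, a DPP filter on `p` would be dropped when a static `p = '2022-01-01'` filter is already present above the scan, while a DPP filter on an unconstrained attribute survives.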




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


