Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14912#discussion_r77427992
  
    --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/FilterPushdownSuite.scala ---
    @@ -171,6 +172,27 @@ class FilterPushdownSuite extends PlanTest {
         comparePlans(optimized, correctAnswer)
       }
     
    +  test("push down filters that are combined") {
    +    // The following predicate ('a === 2 || 'a === 3) && ('c > 10 || 'a === 2)
    +    // will be simplified as ('a == 2) || ('c > 10 && 'a == 3).
    +    // ('a === 2 || 'a === 3) can be pushed down. But the simplified one can't.
    --- End diff --
    
    Yeah, as I mentioned in the description, this is currently the simplest way to prevent predicates that could be pushed down from ending up not pushed down.
    
    Your case is not pushed down to begin with.
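    
    To make the pushdown side concrete, here is a small sketch of mine (not code from this PR): filter pushdown splits the condition on top-level `And`s via `PredicateHelper.splitConjunctivePredicates`, so the original form yields a conjunct that references only `'a`, while the simplified form is a single `Or` that references both `'a` and `'c` and can't be split:
    
    ```scala
    import org.apache.spark.sql.catalyst.dsl.expressions._
    import org.apache.spark.sql.catalyst.expressions.{Expression, PredicateHelper}
    
    // Illustration only: plain Catalyst expressions, no optimizer run.
    object PushdownSplitDemo extends PredicateHelper {
      val original: Expression =
        ('a.int === 2 || 'a.int === 3) && ('c.int > 10 || 'a.int === 2)
      val simplified: Expression =
        ('a.int === 2) || ('c.int > 10 && 'a.int === 3)
    
      // Two conjuncts; the first one references only 'a, so it can be pushed
      // down on its own.
      val originalConjuncts: Seq[Expression] = splitConjunctivePredicates(original)
    
      // A single Or that references both 'a and 'c; nothing can be split off.
      val simplifiedConjuncts: Seq[Expression] = splitConjunctivePredicates(simplified)
    }
    ```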
    
    Because the optimizer rules are independent, the boolean simplification logic is just a general rule for simplifying predicates and is not aware of the pushdown logic. Boolean simplification itself looks fine: it makes sense to rewrite `(a > 10 || a < 100) && (a > 10 || b == 5)` to `(a > 10) || (a < 100 && b == 5)`. However, that rewrite causes the pushdown issue.
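    
    For reference, the rewrite in question is the usual factoring of a common disjunct, `(a || b) && (a || c)` => `a || (b && c)`. A minimal sketch of that transformation on Catalyst expressions (my own illustration, not the actual `BooleanSimplification` rule, which covers more cases) looks like:
    
    ```scala
    import org.apache.spark.sql.catalyst.expressions.{And, Expression, Or}
    
    // Sketch only: factor out a disjunct that is common to both sides of an And.
    // (a || b) && (a || c)  =>  a || (b && c)
    def factorCommonDisjunct(expr: Expression): Expression = expr match {
      case And(Or(a1, b), Or(a2, c)) if a1.semanticEquals(a2) =>
        Or(a1, And(b, c))
      case other => other
    }
    ```
    
    Applied to the predicate in the test above, this is the step that merges the two conjuncts into a single `Or`, which is why the pushable `('a === 2 || 'a === 3)` disappears.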
    
    Your other approach also makes sense to me. I have thought about it; I just wasn't sure whether it is worth doing for this corner case, because it needs more code changes.
    
    If it is acceptable, I will implement it. Thank you.


