zml1206 opened a new pull request, #46499:
URL: https://github.com/apache/spark/pull/46499

   ### What changes were proposed in this pull request?
   Avoid pushing down a predicate if a non-cheap expression would exceed the reuse 
limit. Pushing a predicate down through a Project/Aggregate requires replacing 
attribute references with the underlying expressions; if such an expression is 
non-cheap and referenced many times, the cost of repeated evaluation may outweigh 
the benefit of predicate pushdown.
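
   The trade-off above can be sketched in plain Scala. This is an illustrative model only, not the actual Catalyst rule: the names `ProjectedExpr`, `canPushDown`, and the reference-counting helper are hypothetical, but they capture the idea of skipping pushdown when a non-cheap projected expression would be duplicated more than the limit allows.

   ```scala
   // Hypothetical sketch, not the real Catalyst API: model a projected
   // column and whether its defining expression is cheap to re-evaluate.
   case class ProjectedExpr(name: String, cheap: Boolean)

   // Count how many times the predicate refers to each projected column.
   def referenceCounts(predicateRefs: Seq[String]): Map[String, Int] =
     predicateRefs.groupBy(identity).map { case (k, v) => k -> v.size }

   // Push down only if every non-cheap expression is referenced at most
   // `limit` times; otherwise substitution duplicates expensive work.
   def canPushDown(exprs: Seq[ProjectedExpr],
                   predicateRefs: Seq[String],
                   limit: Int): Boolean = {
     val counts = referenceCounts(predicateRefs)
     exprs.forall(e => e.cheap || counts.getOrElse(e.name, 0) <= limit)
   }

   // A non-cheap alias `c` (e.g. a nested CASE WHEN) used twice in the
   // predicate: with limit = 1, pushdown would evaluate it twice, so skip.
   val exprs = Seq(ProjectedExpr("c", cheap = false))
   assert(!canPushDown(exprs, Seq("c", "c"), limit = 1))
   assert(canPushDown(exprs, Seq("c"), limit = 1))
   ```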
   
   
   ### Why are the changes needed?
   Like #33958, this avoids performance regressions and larger plans (e.g., deeply 
nested CASE WHEN expressions). The difference is that pushdown brings additional 
benefits, so the reuse-count limit is `Int.MaxValue` instead of 1.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   
   ### How was this patch tested?
   Unit test.
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

