cloud-fan commented on a change in pull request #34023:
URL: https://github.com/apache/spark/pull/34023#discussion_r710681876
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/planning/patterns.scala
##########
@@ -144,13 +144,15 @@ object ScanOperation extends OperationHelper with PredicateHelper {
     case Filter(condition, child) =>
       collectProjectsAndFilters(child) match {
         case Some((fields, filters, other, aliases)) =>
-          // Follow CombineFilters and only keep going if 1) the collected Filters
-          // and this filter are all deterministic or 2) if this filter is the first
-          // collected filter and doesn't have common non-deterministic expressions
-          // with lower Project.
+          // When collecting projects and filters, we effectively push down filters through
+          // projects. We need to meet the following conditions to do so:
+          // 1) no Project collected so far or the collected Projects are all deterministic
Review comment:
yea
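
For context, here is a plain-Scala sketch (not Spark code; `PushdownSketch` and both method names are hypothetical) of why condition 1) requires the collected Projects to be deterministic before a filter may be pushed through them. The "project" attaches a tag to each row; when the tag expression is non-deterministic, swapping filter and project changes the result.

```scala
import scala.util.Random

object PushdownSketch {
  // Filter above Project: each row's tag is computed exactly once, and the
  // filter sees the same tag value that the output row carries.
  def projectThenFilter(rows: Seq[Int], seed: Long): Seq[(Int, Int)] = {
    val rng = new Random(seed)
    rows.map(r => (r, rng.nextInt(100))).filter { case (_, tag) => tag % 2 == 0 }
  }

  // Unsafe "pushdown" of the filter below the non-deterministic project:
  // the tag expression is now evaluated again after filtering, so rows that
  // passed the filter do not necessarily carry an even tag anymore.
  def filterThenProject(rows: Seq[Int], seed: Long): Seq[(Int, Int)] = {
    val rng = new Random(seed)
    rows.filter(_ => rng.nextInt(100) % 2 == 0).map(r => (r, rng.nextInt(100)))
  }
}
```

The original plan always satisfies the filter's predicate on its output, while the reordered plan generally does not, which is why the pushdown is only safe when the projection is deterministic.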
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]