ulysses-you commented on code in PR #41088:
URL: https://github.com/apache/spark/pull/41088#discussion_r1189269733


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala:
##########
@@ -543,7 +561,7 @@ case class FileSourceScanExec(
         dataSchema = relation.dataSchema,
         partitionSchema = relation.partitionSchema,
         requiredSchema = requiredSchema,
-        filters = pushedDownFilters,
+        filters = dynamicallyPushedDownFilters,

Review Comment:
   We do not change when subquery execution is triggered. When `execute` is called on a `SparkPlan`, it always prepares the plan first and calls `updateResult` to execute the subqueries. So there should be no behavior change with this PR; it only takes the already-computed subquery result and pushes it down as a filter.
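
   To make the ordering concrete, here is a minimal sketch of the flow described above (simplified and abbreviated, not the exact `SparkPlan` source):

   ```scala
   import org.apache.spark.rdd.RDD
   import org.apache.spark.sql.catalyst.InternalRow

   // Sketch: execute() always goes through executeQuery(), which prepares the
   // plan and waits for subqueries (each updating its result) before
   // doExecute() runs, so the subquery result consumed by the dynamically
   // pushed-down filters is already available when the scan is built.
   abstract class SparkPlanSketch {

     final def execute(): RDD[InternalRow] = executeQuery {
       doExecute()
     }

     protected final def executeQuery[T](query: => T): T = {
       prepare()            // plans and starts any subqueries
       waitForSubqueries()  // blocks until every subquery has finished and updated its result
       query                // doExecute() runs only after this point
     }

     protected def doExecute(): RDD[InternalRow]

     protected def prepare(): Unit
     protected def waitForSubqueries(): Unit
   }
   ```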




