cloud-fan commented on a change in pull request #33650:
URL: https://github.com/apache/spark/pull/33650#discussion_r691294177



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
##########
@@ -50,8 +49,17 @@ object PushDownUtils extends PredicateHelper {
         val translatedFilters = mutable.ArrayBuffer.empty[sources.Filter]
         // Catalyst filter expressions that can't be translated to data source filters.
         val untranslatableExprs = mutable.ArrayBuffer.empty[Expression]
+        val dataFilters = r match {
+          case f: FileScanBuilder =>
+            val (partitionFilters, fileDataFilters) =
+              DataSourceUtils.getPartitionKeyFiltersAndDataFilters(
+              f.getSparkSession, scanBuilderHolder.relation, f.readPartitionSchema(), filters)
+            f.pushPartitionFilters(ExpressionSet(partitionFilters).toSeq, fileDataFilters)

Review comment:
       can we make the contract clear?
   
   I thought we added this private API `pushPartitionFilters` for file source v2 only to push partition filters. Individual file source v2 implementations can implement `SupportsPushDownFilters` to get data filters.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
