rdblue commented on a change in pull request #25955: [SPARK-29277][SQL] Add early DSv2 filter and projection pushdown
URL: https://github.com/apache/spark/pull/25955#discussion_r338142621
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
 ##########
 @@ -243,17 +247,36 @@ class FindDataSourceTable(sparkSession: SparkSession) extends Rule[LogicalPlan]
   override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
     case i @ InsertIntoStatement(UnresolvedCatalogRelation(tableMeta), _, _, _, _)
         if DDLUtils.isDatasourceTable(tableMeta) =>
-      i.copy(table = readDataSourceTable(tableMeta))
+      if (DataSource.isV2Provider(tableMeta.provider.get, sparkSession.sessionState.conf)) {
 
 Review comment:
   > Yes it's currently working. But it adds a refactor and a hack to fix the issue introduced by the refactoring.
   
   No, it fixes an existing problem -- that rules were order dependent. That's not a problem introduced here.
   
   I think we should get this in as it is. This is one of the issues we agreed in the community sync is important to get into the 3.0 release, so it is a priority. #26214 is going to take longer to find a good solution, and I think this one is a reasonable start.
   
   @brkyvz, what do you think?
