aokolnychyi commented on a change in pull request #35395:
URL: https://github.com/apache/spark/pull/35395#discussion_r815232986



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala
##########
@@ -255,7 +255,13 @@ class DataSourceV2Strategy(session: SparkSession) extends Strategy with Predicat
     case OverwritePartitionsDynamic(r: DataSourceV2Relation, query, _, _, Some(write)) =>
       OverwritePartitionsDynamicExec(planLater(query), refreshCache(r), write) :: Nil
 
-    case DeleteFromTable(relation, condition) =>
+    case DeleteFromTableWithFilters(r: DataSourceV2Relation, filters) =>
+      DeleteFromTableExec(r.table.asDeletable, filters.toArray, refreshCache(r)) :: Nil
+
+    case DeleteFromTable(_, _, Some(rewritePlan)) =>
+      planLater(rewritePlan) :: Nil
+
+    case DeleteFromTable(relation, condition, None) =>

Review comment:
When I tried this, it actually uncovered all the differences. This rule has a very specific purpose and covers only a subset of the logic in `DataSourceV2Strategy`. After a closer look, I am not sure throwing exceptions in the optimizer rule is a great idea. Since most steps are done through common utils (predicate splitting, normalization, filter conversion), I'd keep this rule simple. Its main purpose is to remove the rewrite plan when a delete can be handled using filters.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
