aokolnychyi commented on a change in pull request #30562:
URL: https://github.com/apache/spark/pull/30562#discussion_r534038282



##########
File path: sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsDelete.java
##########
@@ -28,6 +28,25 @@
  */
 @Evolving
 public interface SupportsDelete {
+
+  /**
+   * Checks whether it is possible to delete data from a data source table that matches filter
+   * expressions.
+   * <p>
+   * Rows should be deleted from the data source iff all of the filter expressions match.
+   * That is, the expressions must be interpreted as a set of filters that are ANDed together.
+   * <p>
+   * Spark will call this method to check if the delete is possible without significant effort.

Review comment:
       I've updated the PR description with more information from my comment above.

       The design doc is being prepared. It seemed to me that the community is confident enough that we will need such a check at planning time (maybe I misunderstand the concepts and expectations behind `SupportsDelete`). In my view, this check is fairly independent of the design of the row-level API, which is why I submitted it now.
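
       For illustration, a minimal sketch of what such a planning-time check could look like on the connector side. The `canDeleteWhere(Filter[])` signature, the `PartitionedTable` class, and the partition column `part` are assumptions for this sketch (the method declaration is cut off in the quoted diff), and a real connector would also implement `Table`, which is omitted here.

       ```java
       import org.apache.spark.sql.connector.catalog.SupportsDelete;
       import org.apache.spark.sql.sources.EqualTo;
       import org.apache.spark.sql.sources.Filter;

       // Hypothetical connector table that can only delete whole partitions.
       // Whether a DELETE is feasible is decided at planning time by inspecting
       // the ANDed filters, as described in the javadoc above.
       public class PartitionedTable implements SupportsDelete {

         @Override
         public boolean canDeleteWhere(Filter[] filters) {
           // Accept the delete only if every filter is an equality predicate on the
           // (assumed) partition column, so the delete maps to dropping partitions
           // and requires no rewrite of data files.
           for (Filter filter : filters) {
             boolean onPartitionColumn =
                 filter instanceof EqualTo && ((EqualTo) filter).attribute().equals("part");
             if (!onPartitionColumn) {
               return false;
             }
           }
           return true;
         }

         @Override
         public void deleteWhere(Filter[] filters) {
           // Drop the matching partitions; implementation omitted in this sketch.
         }
       }
       ```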



