cloud-fan commented on a change in pull request #30562:
URL: https://github.com/apache/spark/pull/30562#discussion_r534004705



##########
File path: sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsDelete.java
##########
@@ -28,6 +28,25 @@
  */
 @Evolving
 public interface SupportsDelete {
+
+  /**
+   * Checks whether it is possible to delete data from a data source table that matches filter
+   * expressions.
+   * <p>
+   * Rows should be deleted from the data source iff all of the filter expressions match.
+   * That is, the expressions must be interpreted as a set of filters that are ANDed together.
+   * <p>
+   * Spark will call this method to check if the delete is possible without significant effort.
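
For context, a minimal sketch of how a connector might satisfy the contract described in this Javadoc, assuming the method being added is `boolean canDeleteWhere(Filter[] filters)` (mirroring the existing `deleteWhere(Filter[])` in this interface). The partition-based strategy and all names here (`PartitionedTable`, the constructor, the column set) are illustrative assumptions, not code from the PR:

```java
import java.util.Set;
import org.apache.spark.sql.connector.catalog.SupportsDelete;
import org.apache.spark.sql.sources.Filter;

// Hypothetical partition-based table (names are illustrative only).
class PartitionedTable implements SupportsDelete {

  private final Set<String> partitionCols;  // e.g. {"date", "region"}

  PartitionedTable(Set<String> partitionCols) {
    this.partitionCols = partitionCols;
  }

  // The delete is "possible without significant effort" only when every
  // ANDed filter references partition columns alone, so deleting amounts
  // to dropping whole partitions rather than rewriting data files.
  @Override
  public boolean canDeleteWhere(Filter[] filters) {
    for (Filter filter : filters) {
      for (String ref : filter.references()) {
        if (!partitionCols.contains(ref)) {
          return false;  // touches a data column; would require a rewrite
        }
      }
    }
    return true;
  }

  @Override
  public void deleteWhere(Filter[] filters) {
    // Drop the partitions matched by the ANDed filters (omitted).
  }
}
```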

Review comment:
       My worry is that we may change the API back and forth if we don't have a clear big picture. Right now this patch is not useful on its own, as it only changes where we throw the exception; but I can see that it will become useful once we have the row-level delete API, since `canDeleteWhere` can then decide whether to use the row-level API or not.
   
   Do we have a sketch of what the row-level API would look like? It seems odd to merge a patch whose usefulness depends on something that hasn't been designed yet.
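   
   To make that concrete, the decision being discussed could look roughly like the sketch below. `SupportsRowLevelDelete` and `planRowLevelDelete` are placeholder names for the not-yet-designed row-level API, not anything in this PR:
   
   ```java
   // Hypothetical planner-side dispatch (sketch only): canDeleteWhere lets
   // Spark fall back to a row-level delete instead of throwing when a
   // metadata-only delete is not possible.
   if (table instanceof SupportsDelete
       && ((SupportsDelete) table).canDeleteWhere(filters)) {
     // Cheap path: e.g. drop matching partitions or files by metadata.
     ((SupportsDelete) table).deleteWhere(filters);
   } else if (table instanceof SupportsRowLevelDelete) {  // placeholder interface
     // Expensive path: rewrite the rows/files affected by the condition.
     planRowLevelDelete((SupportsRowLevelDelete) table, filters);
   } else {
     throw new UnsupportedOperationException(
         "Table does not support DELETE with this condition");
   }
   ```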



