rdblue commented on a change in pull request #30562:
URL: https://github.com/apache/spark/pull/30562#discussion_r533716269



##########
File path: sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsDelete.java
##########
@@ -28,6 +28,25 @@
  */
 @Evolving
 public interface SupportsDelete {
+
+  /**
+   * Checks whether it is possible to delete data from a data source table that matches filter
+   * expressions.
+   * <p>
+   * Rows should be deleted from the data source iff all of the filter expressions match.
+   * That is, the expressions must be interpreted as a set of filters that are ANDed together.
+   * <p>
+   * Spark will call this method to check if the delete is possible without significant effort.

Review comment:
       This comes from the documentation for `deleteWhere`:
   
   > Implementations may reject a delete operation if the delete isn't possible without significant effort. For example, . . .
   
   I think that phrasing is a bit clearer because it uses "implementations may reject", so what constitutes "significant effort" is determined by the implementation.
   
   I think a clearer way to say it here is to refer to that standard: "Spark will call this method to check whether `deleteWhere` would reject the delete operation because it requires significant effort."
   
   It would also help to add more context: this method is how some sources determine whether a metadata delete can be performed.
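   
   For illustration, here is a minimal sketch (not from this PR) of how a partitioned source might implement that contract: `canDeleteWhere` reports whether a metadata-only delete is possible, and `deleteWhere` rejects anything else. `ExamplePartitionedTable` and its hard-coded partition columns are hypothetical; only `SupportsDelete`, `canDeleteWhere`, `deleteWhere`, and `Filter` come from the API under discussion, and a real table would also implement `Table`.
   
   ```java
   import java.util.Arrays;
   import java.util.HashSet;
   import java.util.Set;
   
   import org.apache.spark.sql.connector.catalog.SupportsDelete;
   import org.apache.spark.sql.sources.Filter;
   
   public class ExamplePartitionedTable implements SupportsDelete {
   
     // Hypothetical: a real implementation would derive these from the
     // table's partitioning instead of hard-coding them.
     private static final Set<String> PARTITION_COLUMNS =
         new HashSet<>(Arrays.asList("date", "region"));
   
     @Override
     public boolean canDeleteWhere(Filter[] filters) {
       // A metadata-only delete is possible when every filter references
       // partition columns alone, so whole partitions can be dropped
       // without rewriting data files.
       return Arrays.stream(filters)
           .allMatch(f -> PARTITION_COLUMNS.containsAll(Arrays.asList(f.references())));
     }
   
     @Override
     public void deleteWhere(Filter[] filters) {
       if (!canDeleteWhere(filters)) {
         // Per the deleteWhere contract: reject a delete that is not
         // possible without significant effort (here, rewriting files).
         throw new IllegalArgumentException(
             "Only partition-level deletes are supported");
       }
       // ... drop the matching partitions from table metadata (omitted) ...
     }
   }
   ```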



