viirya commented on a change in pull request #30562:
URL: https://github.com/apache/spark/pull/30562#discussion_r533727569
##########
File path: sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsDelete.java
##########
@@ -28,6 +28,25 @@
*/
@Evolving
public interface SupportsDelete {
+
+  /**
+   * Checks whether it is possible to delete data from a data source table that matches filter
+   * expressions.
+   * <p>
+   * Rows should be deleted from the data source iff all of the filter expressions match.
+   * That is, the expressions must be interpreted as a set of filters that are ANDed together.
+   * <p>
+   * Spark will call this method to check if the delete is possible without significant effort.
Review comment:
> I think a clearer way to say it here is to refer to that standard: "Spark will call this method to check whether deleteWhere would reject the delete operation because it requires significant effort."
This sounds better, as it makes the contract between `canDeleteWhere` and `deleteWhere` explicit. `canDeleteWhere` is then a much lighter-weight way to know whether `deleteWhere` would reject a delete operation, without actually calling `deleteWhere`.
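
To make that contract concrete, here is a minimal sketch of a `SupportsDelete` implementation, assuming the `canDeleteWhere(Filter[])` / `deleteWhere(Filter[])` signatures proposed in this PR. The `PartitionedTable` class, the `date` partition column, and the partition-drop strategy are illustrative assumptions, not part of the PR:

```java
import org.apache.spark.sql.connector.catalog.SupportsDelete;
import org.apache.spark.sql.sources.EqualTo;
import org.apache.spark.sql.sources.Filter;

// Hypothetical table whose data is laid out by a "date" partition column.
class PartitionedTable implements SupportsDelete {

  // Light-weight, metadata-only check: the delete is cheap only when every
  // filter is an equality on the partition column, because then whole
  // partitions can be dropped instead of rewriting data files.
  @Override
  public boolean canDeleteWhere(Filter[] filters) {
    for (Filter filter : filters) {
      boolean isPartitionEquality =
          filter instanceof EqualTo && ((EqualTo) filter).attribute().equals("date");
      if (!isPartitionEquality) {
        return false; // deleteWhere would require significant effort here
      }
    }
    return true;
  }

  // Spark is expected to call canDeleteWhere first; deleteWhere rejects the
  // same filters it would have reported as unsupported, keeping the two
  // methods consistent.
  @Override
  public void deleteWhere(Filter[] filters) {
    if (!canDeleteWhere(filters)) {
      throw new IllegalArgumentException(
          "Cannot delete by these filters without rewriting data files");
    }
    // Drop the matching partitions here (storage-specific, omitted).
  }
}
```

With this split, Spark can cheaply probe `canDeleteWhere` during planning and fail fast (or fall back) before committing to a `deleteWhere` call that would be rejected.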