aokolnychyi commented on a change in pull request #30562:
URL: https://github.com/apache/spark/pull/30562#discussion_r534078654
##########
File path:
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsDelete.java
##########
@@ -28,6 +28,25 @@
*/
@Evolving
public interface SupportsDelete {
+
+  /**
+   * Checks whether it is possible to delete data from a data source table that matches filter
+   * expressions.
+   * <p>
+   * Rows should be deleted from the data source iff all of the filter expressions match.
+   * That is, the expressions must be interpreted as a set of filters that are ANDed together.
+   * <p>
+   * Spark will call this method to check if the delete is possible without significant effort.
Review comment:
If we want to return a set of rejected filters to form a better message,
that is OK with me. I am also not going to push for this change if there is
no consensus. As I wrote before, it seemed the community was confident enough
that we would eventually need such a check at planning time, given how
`SupportsDelete` is designed. If we want to delay this conversation until
later, that is fine with me too.
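
For context, here is a minimal sketch of what such a planning-time check could look like on the source side. It assumes the method under discussion is named `canDeleteWhere` and receives the pushed-down `Filter[]` (the signature is cut off in the diff above); the `PartitionedTable` class and its partition columns are made up for illustration:

```java
import java.util.Arrays;
import java.util.Set;

import org.apache.spark.sql.connector.catalog.SupportsDelete;
import org.apache.spark.sql.sources.Filter;

// Hypothetical table that can only delete whole partitions; the class name
// and the partition column set are illustrative, not part of this PR.
public class PartitionedTable implements SupportsDelete {

  private final Set<String> partitionColumns = Set.of("date", "region");

  // Planning-time check: the delete is "possible without significant effort"
  // only if every ANDed filter references partition columns alone, so whole
  // partitions can be dropped without rewriting files.
  @Override
  public boolean canDeleteWhere(Filter[] filters) {
    return Arrays.stream(filters)
        .allMatch(f -> partitionColumns.containsAll(Arrays.asList(f.references())));
  }

  @Override
  public void deleteWhere(Filter[] filters) {
    // Drop the partitions matched by the ANDed filters (implementation omitted).
  }
}
```

A source like this could reject row-level predicates at planning time instead of failing partway through `deleteWhere`.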