cloud-fan commented on a change in pull request #31355:
URL: https://github.com/apache/spark/pull/31355#discussion_r600804966
##########
File path:
sql/catalyst/src/main/java/org/apache/spark/sql/connector/write/RequiresDistributionAndOrdering.java
##########
@@ -42,6 +42,19 @@
*/
Distribution requiredDistribution();
+  /**
+   * Returns the number of partitions required by this write.
+   * <p>
+   * Implementations may override this to require a specific number of input partitions.
+   * <p>
+   * Note that Spark doesn't support a required number of partitions with
+   * {@link UnspecifiedDistribution}; if {@link #requiredDistribution()} returns
+   * {@link UnspecifiedDistribution}, the return value of this method will be ignored.
Review comment:
Shall we ignore or fail? It's very easy to detect it before applying the
requirement.
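
To make the two options under discussion concrete, here is a hypothetical, self-contained sketch. The types `Distribution`, `UnspecifiedDistribution`, `ClusteredDistribution`, and `RequiresDistributionAndOrdering` below are minimal stand-ins, not the real `org.apache.spark.sql.connector` interfaces, and `effectiveNumPartitions` is an invented planner-side helper; it contrasts silently ignoring the requested partition count (the Javadoc's current wording) with failing fast (the behavior the review comment asks about).

```java
// Hypothetical sketch only: stub types standing in for the real Spark
// connector interfaces, used to contrast "ignore" vs. "fail fast".
public class NumPartitionsSketch {

  interface Distribution {}
  static final class UnspecifiedDistribution implements Distribution {}
  static final class ClusteredDistribution implements Distribution {}

  interface RequiresDistributionAndOrdering {
    Distribution requiredDistribution();
    // 0 means "no requirement", mirroring the convention in the Javadoc
    default int requiredNumPartitions() { return 0; }
  }

  // Convenience factory for building example writes (illustration only).
  static RequiresDistributionAndOrdering of(Distribution d, int n) {
    return new RequiresDistributionAndOrdering() {
      public Distribution requiredDistribution() { return d; }
      public int requiredNumPartitions() { return n; }
    };
  }

  // Planner-side check. With failFast=false the requested count is ignored
  // for UnspecifiedDistribution (current Javadoc wording); with failFast=true
  // the invalid combination is rejected up front, as the review suggests.
  static int effectiveNumPartitions(RequiresDistributionAndOrdering write,
                                    boolean failFast) {
    boolean unspecified =
        write.requiredDistribution() instanceof UnspecifiedDistribution;
    if (unspecified && write.requiredNumPartitions() > 0) {
      if (failFast) {
        throw new IllegalArgumentException(
            "requiredNumPartitions() is not supported with UnspecifiedDistribution");
      }
      return 0; // ignore the requested count
    }
    return write.requiredNumPartitions();
  }

  public static void main(String[] args) {
    System.out.println(effectiveNumPartitions(of(new ClusteredDistribution(), 8), false));   // 8
    System.out.println(effectiveNumPartitions(of(new UnspecifiedDistribution(), 8), false)); // 0
  }
}
```

The fail-fast branch shows why detection is cheap: the check is a single `instanceof` on the required distribution before any requirement is applied.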
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]