This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.3 by this push:
new 1617eaded43 [SPARK-38391][SPARK-38768][SQL][FOLLOWUP] Add comments for `pushLimit` and `pushTopN` of `PushDownUtils`
1617eaded43 is described below
commit 1617eaded434069a38cd26cb1335d3fea2501bb0
Author: Jiaan Geng <[email protected]>
AuthorDate: Sun Apr 10 20:36:58 2022 -0700
[SPARK-38391][SPARK-38768][SQL][FOLLOWUP] Add comments for `pushLimit` and `pushTopN` of `PushDownUtils`
### What changes were proposed in this pull request?
`pushLimit` and `pushTopN` of `PushDownUtils` return a tuple of Booleans. It would be good to explain what each Boolean value represents.
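To make the tuple semantics concrete, here is a hypothetical helper (not part of this patch; the names and messages are illustrative) showing how a caller could interpret the two Boolean values returned by `pushLimit`/`pushTopN`:

```scala
// Hypothetical sketch, not Spark code: interprets the (Boolean, Boolean)
// pair documented in this patch.
object PushDownResult {
  // pushed  = the data source accepted the operator
  // partial = the pushdown is partial, so Spark must still apply the
  //           operator (Limit, or Sort + Limit) on the scan results
  def interpret(pushed: Boolean, partial: Boolean): String =
    (pushed, partial) match {
      case (false, _)    => "not pushed: keep the operator in the Spark plan"
      case (true, true)  => "partially pushed: keep the operator and apply it again"
      case (true, false) => "fully pushed: the operator can be removed"
    }
}
```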
### Why are the changes needed?
Make DS V2 API more friendly to developers.
### Does this PR introduce _any_ user-facing change?
'No'.
Just update comments.
### How was this patch tested?
N/A
Closes #36092 from beliefer/SPARK-38391_SPARK-38768_followup.
Authored-by: Jiaan Geng <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit c4397cb3dee4f9fa16297c224da15475b2d5a297)
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../spark/sql/execution/datasources/v2/PushDownUtils.scala | 12 ++++++++++--
1 file changed, 10 insertions(+), 2 deletions(-)
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
index f72310b5d7a..862189ed3af 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
@@ -116,7 +116,11 @@ object PushDownUtils extends PredicateHelper {
}
/**
- * Pushes down LIMIT to the data source Scan
+ * Pushes down LIMIT to the data source Scan.
+ *
+ * @return the tuple of Boolean. The first Boolean value represents whether to push down, and
+ *         the second Boolean value represents whether to push down partially, which means
+ *         Spark will keep the Limit and do it again.
*/
def pushLimit(scanBuilder: ScanBuilder, limit: Int): (Boolean, Boolean) = {
scanBuilder match {
@@ -127,7 +131,11 @@ object PushDownUtils extends PredicateHelper {
}
/**
- * Pushes down top N to the data source Scan
+ * Pushes down top N to the data source Scan.
+ *
+ * @return the tuple of Boolean. The first Boolean value represents whether to push down, and
+ *         the second Boolean value represents whether to push down partially, which means
+ *         Spark will keep the Sort and Limit and do it again.
*/
def pushTopN(
scanBuilder: ScanBuilder,
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]