GitHub user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16696#discussion_r104232961
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala ---
@@ -773,14 +773,20 @@ case class LocalLimit(limitExpr: Expression, child: LogicalPlan) extends UnaryNo
   }

   override def computeStats(conf: CatalystConf): Statistics = {
     val limit = limitExpr.eval().asInstanceOf[Int]
-    val sizeInBytes = if (limit == 0) {
+    val childStats = child.stats(conf)
+    if (limit == 0) {
       // sizeInBytes can't be zero, or sizeInBytes of BinaryNode will also be zero
       // (product of children).
-      1
+      Statistics(
+        sizeInBytes = 1,
+        rowCount = Some(0),
+        isBroadcastable = childStats.isBroadcastable)
     } else {
-      (limit: Long) * output.map(a => a.dataType.defaultSize).sum
+      // The output row count of LocalLimit should be the sum of row count from each partition, but
+      // since the partition number is not available here, we just use statistics of the child
+      // except column stats, because we don't know the distribution after a limit operation
--- End diff --
But I think the max/min should still be corrected?
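
To make the statistics propagation being discussed concrete, here is a minimal, self-contained sketch of the logic in the patch. `Statistics` and `localLimitStats` below are simplified stand-ins, not Catalyst's actual classes; the real `computeStats` lives on the `LocalLimit` operator and takes a `CatalystConf`.

```scala
// Simplified stand-in for Catalyst's Statistics (the real class also
// carries attributeStats, i.e. per-column statistics).
case class Statistics(
    sizeInBytes: BigInt,
    rowCount: Option[BigInt] = None,
    isBroadcastable: Boolean = false)

// Hypothetical helper mirroring the patch: when limit == 0, report a row
// count of 0 but keep sizeInBytes at 1 so that the product over a
// BinaryNode's children never collapses to zero. Otherwise reuse the
// child's statistics: the global output of LocalLimit is
// limit * numPartitions rows, and since the partition count is unknown
// here, the child's row count serves as the available estimate.
// Per the review comment, per-column max/min from the child would still
// be valid *bounds* after a limit (a limit never widens a column's
// range), even though the exact distribution is unknown.
def localLimitStats(limit: Int, childStats: Statistics): Statistics = {
  if (limit == 0) {
    Statistics(
      sizeInBytes = 1,
      rowCount = Some(0),
      isBroadcastable = childStats.isBroadcastable)
  } else {
    childStats
  }
}
```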