Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/21910#discussion_r205996797
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/Average.scala
---
@@ -57,10 +57,9 @@ abstract class AverageLike(child: Expression) extends DeclarativeAggregate {
   // If all input are nulls, count will be 0 and we will get null after the division.
   override lazy val evaluateExpression = child.dataType match {
-    case DecimalType.Fixed(p, s) =>
-      // increase the precision and scale to prevent precision loss
-      val dt = DecimalType.bounded(p + 14, s + 4)
-      Cast(Cast(sum, dt) / Cast(count, DecimalType.bounded(DecimalType.MAX_PRECISION, 0)),
+    case _: DecimalType =>
+      Cast(
+        DecimalPrecision.decimalAndDecimal.lift(sum / Cast(count, DecimalType.LongDecimal)).get,
--- End diff ---
nit: can we just call `apply` instead of `lift(...).get`?
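
The suggestion rests on a standard property of Scala's `PartialFunction`: when the function is defined at the input, `pf.lift(x).get` and `pf(x)` return the same value, so the `lift(...).get` round-trip through `Option` is redundant. A minimal standalone sketch (not Spark code, just illustrating the equivalence):

```scala
// A partial function defined only for non-negative inputs.
val double: PartialFunction[Int, Int] = {
  case n if n >= 0 => n * 2
}

// `lift` turns the partial function into Int => Option[Int];
// `.get` then unwraps the Some produced for a defined input.
val viaLift: Int = double.lift(3).get

// Calling `apply` directly gives the same result in one step.
val viaApply: Int = double(3)

assert(viaLift == viaApply)
```

The only behavioral difference is on undefined inputs: `apply` throws `MatchError` directly, while `lift(...).get` returns `None` and `.get` throws `NoSuchElementException`. Either way it fails, so `apply` is the simpler call.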
---