Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/12421#discussion_r60854163

```diff
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/Optimizer.scala ---
@@ -1321,7 +1321,9 @@ object DecimalAggregates extends Rule[LogicalPlan] {
   /** Maximum number of decimal digits representable precisely in a Double */
   private val MAX_DOUBLE_DIGITS = 15

-  def apply(plan: LogicalPlan): LogicalPlan = plan transformAllExpressions {
+  def apply(plan: LogicalPlan): LogicalPlan = plan transformExpressionsDown {
+    case we @ WindowExpression(AggregateExpression(_, _, _, _), _) => we
--- End diff --
```

Oh, thank you for coming back, @hvanhovell. I tried that last week, but it hit the following error in WindowExec.scala (last week, it was Window.scala):

```
Unsupported window function: cast(((avg(UnscaledValue(a#14)),mode=Complete,isDistinct=false) / 10.0) as decimal(6,5))
java.lang.RuntimeException: Unsupported window function: cast(((avg(UnscaledValue(a#14)),mode=Complete,isDistinct=false) / 10.0) as decimal(6,5))
	at scala.sys.package$.error(package.scala:27)
	at org.apache.spark.sql.execution.WindowExec$$anonfun$windowFrameExpressionFactoryPairs$1$$anonfun$apply$2.apply(WindowExec.scala:183)
```

After fixing that, compiling the generated code (the codegen result) failed because of type mismatches against the input schema (Decimal to Long and vice versa). I also tried adjusting the input schema and the `BoundReference`s. This observation is the same one I mentioned 9 days ago; I have tried other approaches as well, but none of them was a clean solution, so I decided to ask for your help.
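For readers following along: the new `case we @ WindowExpression(AggregateExpression(_, _, _, _), _) => we` line binds the whole window expression and returns it as-is, so the rule does not rewrite an aggregate that sits inside a window frame. Here is a minimal, Spark-free sketch of that pattern-match shape; `Expr`, `Agg`, `WindowExpr`, `Rewritten`, and `rule` are hypothetical stand-ins for Catalyst's `Expression`, `AggregateExpression`, `WindowExpression`, and the `DecimalAggregates` rewrite, not the actual Spark classes.

```scala
// Toy expression tree standing in for Catalyst expressions (assumed names).
object WindowMatchSketch {
  sealed trait Expr
  case class Agg(fn: String) extends Expr                   // stand-in for AggregateExpression
  case class WindowExpr(child: Expr, spec: String) extends Expr // stand-in for WindowExpression
  case class Rewritten(child: Expr) extends Expr            // stand-in for the rewritten form

  // A single rule application, mirroring the shape of the diff's match:
  def rule(e: Expr): Expr = e match {
    // Bind the whole node with `we @ ...` and return it unchanged, so the
    // aggregate wrapped in a window expression is left alone.
    case we @ WindowExpr(Agg(_), _) => we
    // A bare aggregate is rewritten (in Spark, e.g. via UnscaledValue).
    case a: Agg                     => Rewritten(a)
    case other                      => other
  }
}
```

The `we @` binder is ordinary Scala pattern matching: it names the matched node so the case can return it verbatim instead of reconstructing it.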