MaxGekk commented on code in PR #37595:
URL: https://github.com/apache/spark/pull/37595#discussion_r951064662


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/percentiles.scala:
##########
@@ -59,7 +59,8 @@ abstract class PercentileBase
 
   override lazy val dataType: DataType = {
     val resultType = child.dataType match {
-      case it: AnsiIntervalType => it
+      case _: YearMonthIntervalType => YearMonthIntervalType()
+      case _: DayTimeIntervalType => DayTimeIntervalType()

Review Comment:
   It could make sense, but the current approach is to return the default type 
(interval day to second) in many places. See `mean()` (in the PR's description), 
for instance. cc @srielau 
   
   @cloud-fan If you propose changing the current approach, we would need to 
modify all places in one shot in a separate PR. For now, this PR makes the 
`percentile` functions consistent with the others.
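
   To illustrate the convention being discussed, here is a minimal, self-contained 
sketch (using hypothetical stand-in classes, not Spark's actual `DataType` 
hierarchy) of the pattern match from the diff: whatever start/end fields the 
input interval carries, the result type is the default instance of that interval 
family.

```scala
sealed trait DataType

// Hypothetical stand-ins for Spark's YearMonthIntervalType / DayTimeIntervalType:
// the field arguments model the interval's start and end fields, and the
// no-argument defaults model YEAR TO MONTH and DAY TO SECOND respectively.
case class YearMonthIntervalType(startField: Int = 0, endField: Int = 1) extends DataType
case class DayTimeIntervalType(startField: Int = 0, endField: Int = 3) extends DataType
case object DoubleType extends DataType

// Mirrors the PR's pattern match: any year-month interval maps to the default
// YearMonthIntervalType(), any day-time interval to the default
// DayTimeIntervalType(), and other inputs fall through (here, to DoubleType).
def percentileResultType(childType: DataType): DataType = childType match {
  case _: YearMonthIntervalType => YearMonthIntervalType()
  case _: DayTimeIntervalType => DayTimeIntervalType()
  case _ => DoubleType
}
```

   Under this sketch, `percentileResultType(DayTimeIntervalType(2, 3))` returns 
the default `DayTimeIntervalType()`, matching the "default type in many places" 
behavior described above.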



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

