MaxGekk commented on a change in pull request #32107:
URL: https://github.com/apache/spark/pull/32107#discussion_r611147008



##########
File path: sql/core/src/test/scala/org/apache/spark/sql/DataFrameAggregateSuite.scala
##########
@@ -1110,6 +1112,14 @@ class DataFrameAggregateSuite extends QueryTest
     val e = intercept[AnalysisException](arrayDF.groupBy(struct($"col.a")).count())
     assert(e.message.contains("requires integral type"))
   }
+
+  test("SPARK-34716: Support ANSI SQL intervals by the aggregate function `sum`") {

Review comment:
       Could you test more cases:
   1. Negative tests such as overflow
   2. Aggregate nulls and non-nulls
   3. Looking at https://github.com/apache/spark/pull/26325, we will need to add more tests as soon as we support construction of the intervals in SQL via cast or `make_interval`
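
   For cases 1 and 2, a rough sketch (untested against this branch; it assumes the `Period`/`Duration` encoders already used above, that `org.apache.spark.SparkException` is imported in the suite, and that interval overflow surfaces as an `ArithmeticException` wrapped in a `SparkException` at execution time):

   ```scala
   test("SPARK-34716: sum of ANSI intervals overflows") {
     // Two year-month intervals whose total exceeds Int.MaxValue months:
     // the sum should fail rather than wrap around silently.
     val df = Seq(Period.ofMonths(Int.MaxValue), Period.ofMonths(1)).toDF("year-month")
     val e = intercept[SparkException](df.select(sum($"year-month")).collect())
     assert(e.getCause.isInstanceOf[ArithmeticException])
   }

   test("SPARK-34716: sum of ANSI intervals ignores nulls") {
     // Nulls mixed with non-nulls: `sum` should skip them, as for numeric types.
     val df = Seq(Some(Duration.ofDays(1)), None, Some(Duration.ofDays(2)))
       .toDF("day-time")
     checkAnswer(df.select(sum($"day-time")), Row(Duration.ofDays(3)))
   }
   ```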

##########
File path: sql/core/src/test/scala/org/apache/spark/sql/DataFrameAggregateSuite.scala
##########
@@ -1110,6 +1112,14 @@ class DataFrameAggregateSuite extends QueryTest
     val e = intercept[AnalysisException](arrayDF.groupBy(struct($"col.a")).count())
     assert(e.message.contains("requires integral type"))
   }
+
+  test("SPARK-34716: Support ANSI SQL intervals by the aggregate function `sum`") {
+    val df = Seq((Period.ofMonths(10), Duration.ofDays(10)),
+      (Period.ofMonths(1), Duration.ofDays(1)))
+      .toDF("year-month", "day-time")
+    val sumDF = df.select(sum($"year-month"), sum($"day-time"))
+    checkAnswer(sumDF, Row(Period.ofMonths(11), Duration.ofDays(11)))

Review comment:
       You modified the `resultType` but don't check the type in the schema. Could you add such a check, please?
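
   A possible shape for that check, just as a sketch (it assumes the `YearMonthIntervalType`/`DayTimeIntervalType` singletons from `org.apache.spark.sql.types` and the default `sum(...)` column names):

   ```scala
   // StructField is nullable by default, which matches sum's nullable result.
   assert(sumDF.schema == StructType(Seq(
     StructField("sum(year-month)", YearMonthIntervalType),
     StructField("sum(day-time)", DayTimeIntervalType))))
   ```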




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
