cloud-fan commented on a change in pull request #28534:
URL: https://github.com/apache/spark/pull/28534#discussion_r426622092
##########
File path: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/DateExpressionsSuite.scala
##########
@@ -1146,4 +1146,49 @@ class DateExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper {
Literal("yyyy-MM-dd'T'HH:mm:ss.SSSz")), "Fail to parse")
}
}
+
+ test("SPARK-31710:Fix millisecond and microsecond convert to timestamp in
to_timestamp") {
+ withSQLConf() {
+ checkEvaluation(
+ GetTimestamp(
Review comment:
I thought we were going to add 3 new functions, `TIMESTAMP_SECONDS`,
`TIMESTAMP_MILLIS`, and `TIMESTAMP_MICROS`, to follow BigQuery. Why do we
overload `GetTimestamp` instead? Passing the unit as a string parameter is
fragile, as we need to define and document the supported units and the
behavior for invalid units.
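
To illustrate the concern, here is a minimal, self-contained Scala sketch (not the actual Spark expressions; the helper names are made up for this example) contrasting a unit-string parameter with dedicated per-unit functions. It assumes the internal representation is microseconds since the epoch, which is how Spark stores timestamps:

```scala
import java.util.concurrent.TimeUnit

object TimestampConversionSketch {

  // Unit-string approach: the supported units and the behavior for invalid
  // units have to be defined and documented explicitly.
  def toTimestampMicros(value: Long, unit: String): Long = unit.toUpperCase match {
    case "SECONDS" => TimeUnit.SECONDS.toMicros(value)
    case "MILLIS"  => TimeUnit.MILLISECONDS.toMicros(value)
    case "MICROS"  => value
    case other     => throw new IllegalArgumentException(s"Unsupported unit: $other")
  }

  // Dedicated functions (BigQuery-style): each one has a fixed unit, so there
  // is no invalid-unit case to specify or document.
  def timestampSeconds(seconds: Long): Long = TimeUnit.SECONDS.toMicros(seconds)
  def timestampMillis(millis: Long): Long   = TimeUnit.MILLISECONDS.toMicros(millis)
  def timestampMicros(micros: Long): Long   = micros
}
```

With dedicated functions the unit is part of the name, so misuse fails at analysis/resolution time rather than depending on a runtime check of a string argument.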
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]