MaxGekk commented on a change in pull request #24287: [WIP][SPARK-27357][SQL] Cast timestamps to/from dates independently from time zones
URL: https://github.com/apache/spark/pull/24287#discussion_r272824319
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/DataFrameFunctionsSuite.scala
##########
@@ -929,15 +930,17 @@ class DataFrameFunctionsSuite extends QueryTest with SharedSQLContext {
       Seq((1.toByte, 3L, 1)).toDF().select(sequence('_1, '_2, '_3)),
       Seq(Row(Array(1L, 2L, 3L))))
-    checkAnswer(
-      spark.sql("select sequence(" +
-        " cast('2018-01-01' as date)" +
-        ", cast('2018-01-02 00:00:00' as timestamp)" +
-        ", interval 12 hours)"),
-      Seq(Row(Array(
-        Timestamp.valueOf("2018-01-01 00:00:00"),
-        Timestamp.valueOf("2018-01-01 12:00:00"),
-        Timestamp.valueOf("2018-01-02 00:00:00")))))
+    withSQLConf(SQLConf.DATETIME_JAVA8API_ENABLED.key -> "true") {
Review comment:
I switched to the Java 8 API here and in other places in the PR because it is slightly hard to control time zone related manipulations inside `java.sql.Timestamp`. For example, `Timestamp.valueOf` parses its input string using the JVM's default time zone.
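
Below is a minimal, self-contained sketch (not code from this PR) of the behaviour described above: `java.sql.Timestamp.valueOf` interprets its input string in the JVM default time zone, so the parsed instant shifts when the default zone changes, whereas `java.time` types let a test state the zone explicitly. The object name `TimestampZoneDemo` is purely illustrative.

```scala
import java.sql.Timestamp
import java.time.{LocalDateTime, ZoneId}
import java.util.TimeZone

// Illustrative only: shows why Timestamp.valueOf is sensitive to the JVM default
// time zone, which is the motivation for switching the test to the Java 8 API.
object TimestampZoneDemo {
  def main(args: Array[String]): Unit = {
    val input = "2018-01-01 12:00:00"
    val saved = TimeZone.getDefault
    try {
      for (zone <- Seq("UTC", "America/Los_Angeles")) {
        TimeZone.setDefault(TimeZone.getTimeZone(zone))
        // The epoch millis of the parsed Timestamp change with the default zone.
        println(s"$zone -> ${Timestamp.valueOf(input).getTime}")
      }
    } finally {
      TimeZone.setDefault(saved)
    }

    // With java.time the zone is explicit, so the result does not depend on a
    // mutable JVM-wide default.
    val instant = LocalDateTime.parse("2018-01-01T12:00:00")
      .atZone(ZoneId.of("UTC"))
      .toInstant
    println(s"explicit UTC -> ${instant.toEpochMilli}")
  }
}
```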