Github user ckadner commented on a diff in the pull request:
https://github.com/apache/spark/pull/6983#discussion_r66765611
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/util/DateTimeUtilsSuite.scala ---
@@ -48,4 +49,41 @@ class DateTimeUtilsSuite extends SparkFunSuite {
     val t2 = DateTimeUtils.toJavaTimestamp(DateTimeUtils.fromJulianDay(d1, ns1))
     assert(t.equals(t2))
   }
+
+  test("SPARK-6785: java date conversion before and after epoch") {
+    def checkFromToJavaDate(d1: Date): Unit = {
+      val d2 = DateTimeUtils.toJavaDate(DateTimeUtils.fromJavaDate(d1))
+      assert(d2.toString === d1.toString)
--- End diff --
No. The Date object is internally represented as milliseconds since the epoch,
which will not match for the use cases we are testing. We only want the
year-month-day portion of the Date to match, which we get via Date.toString().
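
For illustration only, here is a minimal self-contained sketch in plain Scala
using java.sql.Date (not the Spark DateTimeUtils helpers and not the PR's test
code) of why comparing raw millisecond values would fail while comparing the
toString() output works:

    import java.sql.Date

    // A Date whose millisecond value carries a time-of-day component
    // (one hour past local midnight on 2015-06-11).
    val withTime = new Date(Date.valueOf("2015-06-11").getTime + 3600L * 1000L)

    // The same calendar day reconstructed from its "yyyy-MM-dd" string form,
    // i.e. local midnight of that day.
    val dayOnly = Date.valueOf(withTime.toString)

    withTime.getTime == dayOnly.getTime     // false: raw millis differ by one hour
    withTime.toString == dayOnly.toString   // true: year-month-day portion matches

So (time-zone edge cases aside) the test compares d1.toString with d2.toString
rather than the underlying millisecond values.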