singhpk234 opened a new pull request, #5860: URL: https://github.com/apache/iceberg/pull/5860
### About the change

Solves https://github.com/apache/iceberg/issues/5104.

Presently, when `spark.sql.datetime.java8API.enabled` is set to `true`:

- `java.time.LocalDate` is returned for the Spark SQL `DATE` type
- `java.time.Instant` is returned for the Spark SQL `TIMESTAMP` type

So unconditionally casting a date value to `java.sql.Date`, or a timestamp value to `java.sql.Timestamp`, is incorrect and leads to a `ClassCastException`. This change uses the appropriate APIs from Spark's `DateTimeUtils` class, which take these scenarios into account and dispatch on the object type:

```scala
/**
 * Converts a Java object to days.
 *
 * @param obj Either an object of `java.sql.Date` or `java.time.LocalDate`.
 * @return The number of days since 1970-01-01.
 */
def anyToDays(obj: Any): Int = obj match {
  case d: Date => fromJavaDate(d)
  case ld: LocalDate => localDateToDays(ld)
}

/**
 * Converts a Java object to microseconds.
 *
 * @param obj Either an object of `java.sql.Timestamp` or `java.time.Instant`.
 * @return The number of micros since the epoch.
 */
def anyToMicros(obj: Any): Long = obj match {
  case t: Timestamp => fromJavaTimestamp(t)
  case i: Instant => instantToMicros(i)
}
```

----

### Testing Done

Added a UT which would fail without this change.

cc @rdblue @aokolnychyi
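To illustrate the same dispatch-on-runtime-type pattern in Java, here is a minimal, self-contained sketch. The `EpochConversions` class and method names are hypothetical, not the actual Iceberg or Spark code, and unlike Spark's `fromJavaDate`/`fromJavaTimestamp` it ignores session time-zone handling:

```java
import java.sql.Date;
import java.sql.Timestamp;
import java.time.Instant;
import java.time.LocalDate;

public class EpochConversions {

  // Accepts either java.sql.Date or java.time.LocalDate and returns
  // days since 1970-01-01, instead of blindly casting to java.sql.Date.
  static int anyToDays(Object obj) {
    if (obj instanceof LocalDate) {
      return (int) ((LocalDate) obj).toEpochDay();
    } else if (obj instanceof Date) {
      // Simplification: interprets the Date in the JVM default time zone;
      // Spark's DateTimeUtils.fromJavaDate does proper zone rebasing.
      return (int) ((Date) obj).toLocalDate().toEpochDay();
    }
    throw new IllegalArgumentException("Unsupported date type: " + obj.getClass());
  }

  // Accepts either java.sql.Timestamp or java.time.Instant and returns
  // microseconds since the epoch.
  static long anyToMicros(Object obj) {
    if (obj instanceof Instant) {
      Instant i = (Instant) obj;
      return Math.addExact(
          Math.multiplyExact(i.getEpochSecond(), 1_000_000L), i.getNano() / 1_000L);
    } else if (obj instanceof Timestamp) {
      Timestamp t = (Timestamp) obj;
      // getNanos() holds the full fractional second, so truncate
      // getTime() down to whole seconds before combining.
      long seconds = Math.floorDiv(t.getTime(), 1000L);
      return Math.addExact(
          Math.multiplyExact(seconds, 1_000_000L), t.getNanos() / 1_000L);
    }
    throw new IllegalArgumentException("Unsupported timestamp type: " + obj.getClass());
  }
}
```

Without a type check like this, a converter written before the Java 8 time API option existed would hit a `ClassCastException` the moment `spark.sql.datetime.java8API.enabled` hands it a `LocalDate` or `Instant`.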
