MaxGekk commented on a change in pull request #28310:
URL: https://github.com/apache/spark/pull/28310#discussion_r419154657
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -618,6 +618,22 @@ object DateTimeUtils {
instantToMicros(resultTimestamp.toInstant)
}
+ /**
+ * Add the date and the interval's months and days.
+ * Returns a date value, expressed in days since 1.1.1970.
+ *
+ * @throws DateTimeException if the result exceeds the supported date range
+ * @throws IllegalArgumentException if the interval has `microseconds` part
+ */
+  def dateAddInterval(
+      start: SQLDate,
+      interval: CalendarInterval): SQLDate = {
+    require(interval.microseconds == 0,
+      "Cannot add hours, minutes or seconds, milliseconds, microseconds to a date")
+    val ld = LocalDate.ofEpochDay(start).plusMonths(interval.months).plusDays(interval.days)
Review comment:
I see, thanks. It would be nice to document this behavior of the
function (and of timestampAddInterval) somewhere. It is not obvious that we
add months, then days, and then micros; the order could be the opposite.
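To illustrate why the order matters (a minimal sketch using java.time directly, the same API the Scala code delegates to; the dates chosen here are hypothetical, not taken from the PR's tests): adding the months component first clamps the day-of-month before the days component is applied, so swapping the two steps can produce a different date.

```java
import java.time.LocalDate;

public class IntervalOrderDemo {
    public static void main(String[] args) {
        // Hypothetical input: Jan 30 plus an interval of 1 month and 1 day.
        LocalDate start = LocalDate.of(2019, 1, 30);

        // Months first (the order dateAddInterval uses):
        // Jan 30 -> Feb 28 (day clamped to month length) -> Mar 1.
        LocalDate monthsFirst = start.plusMonths(1).plusDays(1);

        // Days first, then months:
        // Jan 30 -> Jan 31 -> Feb 28 (clamped again).
        LocalDate daysFirst = start.plusDays(1).plusMonths(1);

        System.out.println(monthsFirst); // 2019-03-01
        System.out.println(daysFirst);   // 2019-02-28
    }
}
```

The two orders diverge whenever the intermediate month addition hits a shorter month, which is exactly the behavior worth documenting.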
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]