maropu commented on a change in pull request #25716: [SPARK-29012][SQL] Support
special timestamp values
URL: https://github.com/apache/spark/pull/25716#discussion_r322498620
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -848,4 +852,46 @@ object DateTimeUtils {
val sinceEpoch = BigDecimal(timestamp) / MICROS_PER_SECOND + offset
new Decimal().set(sinceEpoch, 20, 6)
}
+
+ def currentTimestamp(): SQLTimestamp = instantToMicros(Instant.now())
+
+ private def today(zoneId: ZoneId): ZonedDateTime = {
+ Instant.now().atZone(zoneId).`with`(LocalTime.MIDNIGHT)
+ }
+
+ private val specialValue =
"""(EPOCH|NOW|TODAY|TOMORROW|YESTERDAY)\p{Blank}*(.*)""".r
+
+ /**
+ * Converts notational shorthands into ordinary timestamps.
+ * @param input - a trimmed string
+ * @param zoneId - zone identifier used to get the current date.
+ * @return Some of microseconds since the epoch if the conversion completed
+ * successfully, otherwise None.
+ */
+ def convertSpecialTimestamp(input: String, zoneId: ZoneId): Option[SQLTimestamp] = {
Review comment:
What's different from `convertSpecialDate`? I know the output dataType is
different, but is the way these special values are handled different, too?
https://github.com/apache/spark/pull/25708/files#diff-da60f07e1826788aaeb07f295fae4b8aR866
Can we share some code between them?
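One way the two converters could share code (a sketch only; the regex is copied from the diff above, but the factoring, the `extractSpecialValue` helper, and the date mappings are assumptions, not the PR's actual code) is to pull the keyword recognition into a shared step and let each converter supply only its own value mapping:

```scala
import java.time.{LocalDate, ZoneId}

object SpecialValues {
  // Regex as in the PR; matches an optional trailing fragment after the keyword.
  private val specialValue =
    """(EPOCH|NOW|TODAY|TOMORROW|YESTERDAY)\p{Blank}*(.*)""".r

  // Shared step: recognize the keyword regardless of the target data type.
  def extractSpecialValue(input: String): Option[String] =
    input.toUpperCase match {
      case specialValue(name, _) => Some(name)
      case _ => None
    }

  // Hypothetical date variant: days since the epoch, built on the shared step.
  def convertSpecialDate(input: String, zoneId: ZoneId): Option[Int] =
    extractSpecialValue(input).map {
      case "EPOCH"     => 0
      case "NOW"       => LocalDate.now(zoneId).toEpochDay.toInt
      case "TODAY"     => LocalDate.now(zoneId).toEpochDay.toInt
      case "TOMORROW"  => LocalDate.now(zoneId).plusDays(1).toEpochDay.toInt
      case "YESTERDAY" => LocalDate.now(zoneId).minusDays(1).toEpochDay.toInt
    }
}
```

A `convertSpecialTimestamp` built the same way would reuse `extractSpecialValue` and differ only in the mapping from keyword to microseconds.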