MaxGekk commented on a change in pull request #30445:
URL: https://github.com/apache/spark/pull/30445#discussion_r527734488



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -493,6 +493,13 @@ object DateTimeUtils {
     Decimal(getMicroseconds(micros, zoneId), 8, 6)
   }
 
+  /**
+   * Returns the number of seconds since 1970-01-01 00:00:00-00 (can be negative).
+   */
+  def getSecondsAfterEpoch(micros: Long, zoneId: ZoneId): Double = {
+    micros.toDouble / MICROS_PER_SECOND

Review comment:
       @gengliangwang If you would like to be compatible with PostgreSQL, you need to take the removed implementation:
   ```scala
   /**
    * Returns the number of seconds with fractional part in microsecond precision
    * since 1970-01-01 00:00:00 local time.
    */
   def getEpoch(timestamp: SQLTimestamp, zoneId: ZoneId): Decimal = {
     val offset = SECONDS.toMicros(
       zoneId.getRules.getOffset(microsToInstant(timestamp)).getTotalSeconds)
     val sinceEpoch = timestamp + offset
     Decimal(sinceEpoch, 20, 6)
   }
   ```
   PostgreSQL takes seconds since the **local** epoch 1970-01-01 00:00:00, but your implementation calculates seconds since 1970-01-01 00:00:00**Z** (i.e., the epoch in the UTC time zone); note that the new `getSecondsAfterEpoch` never actually uses its `zoneId` parameter.
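   To make the difference concrete, here is a minimal, self-contained sketch (the object and method names are hypothetical, not Spark code) contrasting the UTC-based calculation in this PR with a zone-aware calculation in the spirit of the removed `getEpoch`:
   ```scala
   import java.time.{Instant, ZoneId}
   import java.time.temporal.ChronoUnit

   object EpochSemanticsExample {
     val MICROS_PER_SECOND: Long = 1000000L

     // What the PR's getSecondsAfterEpoch computes: seconds since the UTC epoch.
     def secondsSinceUtcEpoch(micros: Long): Double =
       micros.toDouble / MICROS_PER_SECOND

     // Zone-aware variant mirroring the removed getEpoch: shift the timestamp by
     // the zone offset so the result counts seconds since the *local* epoch.
     def secondsSinceLocalEpoch(micros: Long, zoneId: ZoneId): Double = {
       val instant = Instant.EPOCH.plus(micros, ChronoUnit.MICROS)
       val offsetMicros =
         zoneId.getRules.getOffset(instant).getTotalSeconds * MICROS_PER_SECOND
       (micros + offsetMicros).toDouble / MICROS_PER_SECOND
     }

     def main(args: Array[String]): Unit = {
       val micros =
         ChronoUnit.MICROS.between(Instant.EPOCH, Instant.parse("2020-01-01T00:00:00Z"))
       val zone = ZoneId.of("America/Los_Angeles") // offset is -08:00 at that instant
       println(secondsSinceUtcEpoch(micros))         // 1.5778368E9
       println(secondsSinceLocalEpoch(micros, zone)) // 1.577808E9
     }
   }
   ```
   The two results differ by exactly the 28800-second zone offset, which is the discrepancy a PostgreSQL user would observe with `extract(epoch ...)`.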




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


