MaxGekk commented on a change in pull request #28754:
URL: https://github.com/apache/spark/pull/28754#discussion_r436701067



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -420,6 +420,10 @@ object DateTimeUtils {
     Instant.ofEpochSecond(secs, mos * NANOS_PER_MICROS)
   }
 
+  def daysToInstant(daysSinceEpoch: SQLDate): Instant = {
+    Instant.ofEpochSecond(daysSinceEpoch * SECONDS_PER_DAY)

Review comment:
       You assume `daysSinceEpoch` is in UTC here, which is incorrect in the general case.
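   For reference, a zone-aware conversion could look roughly like the sketch below. This is only an illustration of the idea, not the code in this PR; the `zoneId` parameter (e.g. the session time zone) is a hypothetical addition:
   ```scala
   import java.time.{Instant, LocalDate, ZoneId}

   // Sketch only: interpret the day count as a local date and anchor it to a
   // concrete time zone before producing an Instant.
   def daysToInstant(daysSinceEpoch: Int, zoneId: ZoneId): Instant = {
     LocalDate.ofEpochDay(daysSinceEpoch)
       .atStartOfDay(zoneId)
       .toInstant
   }
   ```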

##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -420,6 +420,10 @@ object DateTimeUtils {
     Instant.ofEpochSecond(secs, mos * NANOS_PER_MICROS)
   }
 
+  def daysToInstant(daysSinceEpoch: SQLDate): Instant = {
+    Instant.ofEpochSecond(daysSinceEpoch * SECONDS_PER_DAY)

Review comment:
       `daysSinceEpoch` is "local" days. For example, 1 day is 1970-01-02, and in
   - UTC it is `Instant.ofEpochSecond(1 * SECONDS_PER_DAY)`
   - `Europe/Amsterdam` it is not:
   ```scala
       assert(LocalDateTime.of(
         LocalDate.of(1970, 1, 2),
         LocalTime.MIDNIGHT)
         .atZone(ZoneId.of("Europe/Amsterdam"))
         .toInstant.toEpochMilli / MILLIS_PER_HOUR === SECONDS_PER_DAY / SECONDS_PER_HOUR)
   23 did not equal 24
   ScalaTestFailureLocation: org.apache.spark.sql.catalyst.util.DateTimeUtilsSuite at (DateTimeUtilsSuite.scala:664)
   Expected :24
   Actual   :23
   ```
   The difference of 1 hour is because the days are local.
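   The 1-hour gap matches the offset of `Europe/Amsterdam` around the epoch (+01:00), so local midnight 1970-01-02 in Amsterdam is 1970-01-01T23:00:00Z, i.e. 23 hours after the epoch instead of 24. A quick check (a sketch, not part of the PR):
   ```scala
   import java.time.{Instant, ZoneId}

   // Sketch: the offset of Europe/Amsterdam at the epoch is +01:00.
   val offset = ZoneId.of("Europe/Amsterdam").getRules.getOffset(Instant.EPOCH)
   println(offset)  // expected: +01:00
   ```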

##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -420,6 +420,10 @@ object DateTimeUtils {
     Instant.ofEpochSecond(secs, mos * NANOS_PER_MICROS)
   }
 
+  def daysToInstant(daysSinceEpoch: SQLDate): Instant = {
+    Instant.ofEpochSecond(daysSinceEpoch * SECONDS_PER_DAY)

Review comment:
       Catalyst's DATE type stores the number of days since 1970-01-01 independently of any time zone (NOT in UTC).
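   To illustrate (a sketch, not from the PR): the same day number maps to different instants depending on the time zone it is interpreted in:
   ```scala
   import java.time.{LocalDate, ZoneId, ZoneOffset}

   // Sketch: day 0 is the local date 1970-01-01, which starts at different
   // instants in different time zones.
   val day0 = LocalDate.ofEpochDay(0)
   println(day0.atStartOfDay(ZoneOffset.UTC).toInstant)                    // 1970-01-01T00:00:00Z
   println(day0.atStartOfDay(ZoneId.of("America/Los_Angeles")).toInstant)  // 1970-01-01T08:00:00Z
   ```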

##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala
##########
@@ -420,6 +420,10 @@ object DateTimeUtils {
     Instant.ofEpochSecond(secs, mos * NANOS_PER_MICROS)
   }
 
+  def daysToInstant(daysSinceEpoch: SQLDate): Instant = {
+    Instant.ofEpochSecond(daysSinceEpoch * SECONDS_PER_DAY)

Review comment:
       I believe it is not needed. To calculate the average number of days, you can just do `Math.floorDiv(sum, count)`.
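   For example (a sketch of the idea, not the PR's code), the average of DATE values stored as days since the epoch can be computed on the day counts directly, with no conversion through `Instant`:
   ```scala
   // Sketch: average three dates by averaging their day counts.
   val days = Seq(0, 1, 5)                  // 1970-01-01, 1970-01-02, 1970-01-06
   val sum: Long = days.map(_.toLong).sum
   val count: Long = days.size
   val avgDays = Math.floorDiv(sum, count)  // 2
   println(java.time.LocalDate.ofEpochDay(avgDays))  // 1970-01-03
   ```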



