Github user mgaido91 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21196#discussion_r185106149
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala ---
    @@ -45,6 +45,7 @@ object DateTimeUtils {
       // it's 2440587.5, rounding up to compatible with Hive
       final val JULIAN_DAY_OF_EPOCH = 2440588
       final val SECONDS_PER_DAY = 60 * 60 * 24L
    +  final val SECONDS_PER_MONTH = 60 * 60 * 24 * 31D
    --- End diff ---
    
    hi @dongjoon-hyun! Thanks for the comment. Strictly speaking, of course it is not, but this is how both Spark before this PR (notice the 31.0) and Hive (from which Spark draws its inspiration for this function) work: they treat every month as having 31 days, even though that is of course not accurate.

