GitHub user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14398#discussion_r72878822
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala ---
    @@ -875,11 +877,11 @@ object DateTimeUtils {
             val hh = seconds / 3600
             val mm = seconds / 60 % 60
             val ss = seconds % 60
    -        val nano = millisOfDay % 1000 * 1000000
    +        val calendar = Calendar.getInstance(tz)
     
    -        // create a Timestamp to get the unix timestamp (in UTC)
    -        val timestamp = new Timestamp(year - 1900, month - 1, day, hh, mm, ss, nano)
    -        guess = (millisLocal - timestamp.getTime).toInt
    +        calendar.set(year, month, day, hh, mm, ss)
    --- End diff ---
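
    For context, the hunk presumably feeds a `guess` computation along these lines (a sketch only: the updated `guess` line isn't shown in the hunk, and the name `guessOffset` plus the parameter decomposition are mine, standing in for what the surrounding method does with `millisLocal`):

    ```scala
    import java.util.{Calendar, TimeZone}

    // Hypothetical standalone version of the Calendar-based guess.
    def guessOffset(millisLocal: Long, tz: TimeZone,
        year: Int, month: Int, day: Int, millisOfDay: Int): Int = {
      val seconds = millisOfDay / 1000
      val hh = seconds / 3600
      val mm = seconds / 60 % 60
      val ss = seconds % 60
      val calendar = Calendar.getInstance(tz)
      calendar.set(year, month, day, hh, mm, ss)
      // MILLISECOND keeps whatever value the Calendar was created with, and
      // millisLocal's own subsecond part survives into the difference -- both
      // feed into point 1 below.
      (millisLocal - calendar.getTimeInMillis).toInt
    }
    ```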
    
    OK, so my (previous) comments don't make sense as a response to the edited comment, but I'll leave them as is.
    
    As for the newly updated comment, the approach does avoid the deprecation warning while keeping the existing functionality, but:
    1) It's going to throw away any subsecond information in the timestamp by folding it into the difference between `millisLocal` and the calendar's time in milliseconds (see the sketch after this list)
    2) It doesn't seem to resolve the timezone offset correctly (it appears to give back non-DST times across DST transitions; also illustrated below)*
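
    To make both points concrete, here is a quick standalone illustration (hypothetical code, not the PR's; `millisLocal` here encodes the wall-clock time as-if-UTC, which is the convention the surrounding code appears to rely on):

    ```scala
    import java.util.{Calendar, TimeZone}

    val tz = TimeZone.getTimeZone("America/Los_Angeles")

    val calendar = Calendar.getInstance(tz)
    calendar.clear()  // zeroes every field, including MILLISECOND; without
                      // this, point 1's subsecond leak shows up in the guess
    // 2016-03-13 02:30 falls inside this zone's spring-forward gap
    calendar.set(2016, Calendar.MARCH, 13, 2, 30, 0)

    // Encode the same wall-clock time as-if-UTC to stand in for millisLocal
    val utcCal = Calendar.getInstance(TimeZone.getTimeZone("UTC"))
    utcCal.clear()
    utcCal.set(2016, Calendar.MARCH, 13, 2, 30, 0)
    val millisLocal = utcCal.getTimeInMillis

    val guess = (millisLocal - calendar.getTimeInMillis).toInt
    println(guess / 3600000.0)
    // Expected -8.0: the lenient Calendar appears to resolve the nonexistent
    // wall time with the standard (non-DST) offset, which is point 2.
    ```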
    
    That being said, it is a strict improvement over the previous code, so we could go with this approach instead: it is only a stop-gap solution anyway, and it's really about choosing which way we want this code to be broken in the meantime.

