[ https://issues.apache.org/jira/browse/SPARK-23241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16341593#comment-16341593 ]
Luke R Hospadaruk commented on SPARK-23241:
-------------------------------------------

Not sure if I prioritized this correctly?

> from_unixtime SQL function returning incorrect dates
> ----------------------------------------------------
>
>                 Key: SPARK-23241
>                 URL: https://issues.apache.org/jira/browse/SPARK-23241
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>         Environment: Running Spark 2.2.0 on AWS EMR release version 5.10.0.
> I have observed the same problem through a Zeppelin console and through a Spark app executing SQL strings.
>            Reporter: Luke R Hospadaruk
>            Priority: Minor
>
> I noticed recently that the Spark SQL from_unixtime function appears to return formatted dates with the wrong year for Unix timestamps falling on 2017-12-31 (UTC).
> {code:sql}
> select
>   -- this is right: the timestamp is 2017-12-30 23:59:59
>   from_unixtime(1514678399, 'YYYY-MM-dd HH:mm:ss') as ok,
>   -- this should be 2017-12-31 00:00:00, but in fact is 2018-12-31 00:00:00
>   from_unixtime(1514678399+1, 'YYYY-MM-dd HH:mm:ss') as wrong,
>   -- this should be 2017-12-31 12:00:00, but in fact is 2018-12-31 12:00:00
>   from_unixtime(1514678399+1+12*60*60, 'YYYY-MM-dd HH:mm:ss') as also_wrong,
>   -- this is right: midnight 2018-01-01
>   from_unixtime(1514678399+1+24*60*60, 'YYYY-MM-dd HH:mm:ss') as ok_again
> {code}
> returns:
> {code}
> ok                  | wrong               | also_wrong          | ok_again
> 2017-12-30 23:59:59 | 2018-12-31 00:00:00 | 2018-12-31 12:00:00 | 2018-01-01 00:00:00
> {code}
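A likely explanation (my reading, not stated in the comment above): in the SimpleDateFormat patterns that Spark 2.x uses for from_unixtime, uppercase 'Y' means *week year*, not calendar year, and under en_US week rules Sunday 2017-12-31 falls in week 1 of 2018. Lowercase 'yyyy' gives the calendar year. A minimal standalone Java sketch of the difference (class and method names are mine), pinned to UTC and Locale.US:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class WeekYearDemo {
    // Format an epoch-seconds timestamp in UTC with the given pattern,
    // using en_US week rules (first day Sunday, minimal days in first week = 1).
    static String format(long epochSeconds, String pattern) {
        SimpleDateFormat fmt = new SimpleDateFormat(pattern, Locale.US);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.format(new Date(epochSeconds * 1000L));
    }

    public static void main(String[] args) {
        // 1514678400 = 2017-12-31 00:00:00 UTC (the "wrong" case above)
        // 'yyyy' (calendar year) keeps 2017; 'YYYY' (week year) reports 2018,
        // because 2017-12-31 is the Sunday that opens week 1 of 2018 in en_US.
        System.out.println("yyyy: " + format(1514678400L, "yyyy-MM-dd HH:mm:ss"));
        System.out.println("YYYY: " + format(1514678400L, "YYYY-MM-dd HH:mm:ss"));
    }
}
```

Under this reading, rewriting the queries with 'yyyy-MM-dd HH:mm:ss' should make all four columns come out as expected.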