[ https://issues.apache.org/jira/browse/SPARK-26002?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-26002.
---------------------------------
    Resolution: Fixed
 Fix Version/s: 3.0.0

Issue resolved by pull request 23000
[https://github.com/apache/spark/pull/23000]

> SQL date operators calculate incorrect dayOfYear values for dates before
> 1500-03-01
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-26002
>                 URL: https://issues.apache.org/jira/browse/SPARK-26002
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.3, 2.0.2, 2.1.3, 2.2.0, 2.2.1, 2.2.2, 2.3.0, 2.3.1, 2.3.2, 2.4.0, 3.0.0
>            Reporter: Attila Zsolt Piros
>            Assignee: Attila Zsolt Piros
>            Priority: Major
>             Fix For: 3.0.0
>
> Running the following SQL yields an incorrect result:
> {noformat}
> scala> sql("select dayOfYear('1500-01-02')").show()
> +-----------------------------------+
> |dayofyear(CAST(1500-01-02 AS DATE))|
> +-----------------------------------+
> |                                  1|
> +-----------------------------------+
> {noformat}
> This off-by-one-day error is even more confusing right at the beginning of a year:
> {noformat}
> scala> sql("select year('1500-01-01')").show()
> +------------------------------+
> |year(CAST(1500-01-01 AS DATE))|
> +------------------------------+
> |                          1499|
> +------------------------------+
>
> scala> sql("select month('1500-01-01')").show()
> +-------------------------------+
> |month(CAST(1500-01-01 AS DATE))|
> +-------------------------------+
> |                             12|
> +-------------------------------+
>
> scala> sql("select dayOfYear('1500-01-01')").show()
> +-----------------------------------+
> |dayofyear(CAST(1500-01-01 AS DATE))|
> +-----------------------------------+
> |                                365|
> +-----------------------------------+
> {noformat}
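For context, the likely ingredient behind this behavior (the authoritative analysis is in the linked pull request) is a calendar mismatch in the JVM itself: java.sql.Date and java.util.GregorianCalendar use the hybrid Julian/Gregorian calendar, which applies Julian rules to dates before 1582-10-15, while java.time.LocalDate uses the proleptic Gregorian calendar. The leap-year rules of the two calendars diverge at 1500-03-01 (1500 is a leap year only under Julian rules), which matches the cut-off in the issue title. A minimal spark-shell sketch of the disagreement, using only standard JDK classes (the variable names are illustrative, not taken from the patch):

{noformat}
import java.time.LocalDate
import java.util.{GregorianCalendar, TimeZone}

// Hybrid Julian/Gregorian calendar: dates before 1582-10-15 are
// interpreted under Julian rules.
val hybrid = new GregorianCalendar(TimeZone.getTimeZone("UTC"))
hybrid.clear()
hybrid.set(1500, 0, 1) // 1500-01-01 (Calendar months are 0-based)
val hybridEpochDay = Math.floorDiv(hybrid.getTimeInMillis, 86400000L)

// Proleptic Gregorian calendar: Gregorian rules extended backwards.
val prolepticEpochDay = LocalDate.of(1500, 1, 1).toEpochDay

// The two epoch-day values disagree (by 9 days in this era), so code
// that converts days-since-epoch with one calendar and extracts
// year/month/dayOfYear with the other produces off-by-n results like
// the ones shown above.
println(s"hybrid=$hybridEpochDay proleptic=$prolepticEpochDay " +
  s"diff=${hybridEpochDay - prolepticEpochDay}")
{noformat}

Per the resolution above, pull request 23000 makes these extractions consistent for such pre-Gregorian dates as of 3.0.0.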