[ https://issues.apache.org/jira/browse/SPARK-31654?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17102712#comment-17102712 ]
Ramesh commented on SPARK-31654:
--------------------------------

[~roman_y], [~Ankitraj] It is working as expected:

{code:python}
spark.sql("SELECT sequence(to_date('2018-01-01'), to_date('2019-01-01'), interval 1 month)").rdd.collect()
[Row(sequence(to_date('2018-01-01'), to_date('2019-01-01'), INTERVAL '1 months')=[datetime.date(2018, 1, 1), datetime.date(2018, 2, 1), datetime.date(2018, 3, 1), datetime.date(2018, 4, 1), datetime.date(2018, 5, 1), datetime.date(2018, 6, 1), datetime.date(2018, 7, 1), datetime.date(2018, 8, 1), datetime.date(2018, 9, 1), datetime.date(2018, 10, 1), datetime.date(2018, 11, 1), datetime.date(2018, 12, 1), datetime.date(2019, 1, 1)])]
{code}

> sequence producing inconsistent intervals for month step
> --------------------------------------------------------
>
> Key: SPARK-31654
> URL: https://issues.apache.org/jira/browse/SPARK-31654
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.4
> Reporter: Roman Yalki
> Priority: Major
>
> Taking an example from [https://spark.apache.org/docs/latest/api/sql/]
> {code:java}
> SELECT sequence(to_date('2018-01-01'), to_date('2018-03-01'), interval 1 month);
> {code}
> [2018-01-01, 2018-02-01, 2018-03-01]
> If one expands `stop` to the end of the year, some intervals are returned as the last day of the month, whereas the first day of the month is expected:
> {code:java}
> SELECT sequence(to_date('2018-01-01'), to_date('2019-01-01'), interval 1 month)
> {code}
> [2018-01-01, 2018-02-01, 2018-03-01, *2018-03-31, 2018-04-30, 2018-05-31, 2018-06-30, 2018-07-31, 2018-08-31, 2018-09-30, 2018-10-31*, 2018-12-01, 2019-01-01]

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
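For reference, the expected behavior (every element landing on the same day of the month, with no drift toward month ends) can be sketched in plain Python, independent of Spark. This is a hypothetical illustration, not Spark's implementation; the key point is that each step is computed from the *start* date, rather than by repeatedly adding a month to the previous element, which is the kind of accumulation that produces the month-end drift reported above:

{code:python}
import calendar
from datetime import date

def add_months(d, months):
    # Advance a date by whole months, clamping the day
    # to the length of the target month (handles leap years).
    total = d.month - 1 + months
    year = d.year + total // 12
    month = total % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def month_sequence(start, stop):
    # Emit dates from start to stop inclusive, one month apart.
    # Each element is offset from `start`, so the day of month
    # never drifts the way it would if we stepped from the
    # previous element.
    out = []
    step = 0
    cur = start
    while cur <= stop:
        out.append(cur)
        step += 1
        cur = add_months(start, step)
    return out

print(month_sequence(date(2018, 1, 1), date(2019, 1, 1)))
{code}

Every element of the result is a first-of-month date, matching the documented behavior of sequence with a one-month interval.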