MaxGekk commented on a change in pull request #25153: [SPARK-28389][SQL] Use Java 8 API in add_months
URL: https://github.com/apache/spark/pull/25153#discussion_r303259002
 
 

 ##########
 File path: docs/sql-migration-guide-upgrade.md
 ##########
 @@ -151,6 +151,8 @@ license: |
 
   - Since Spark 3.0, substitution order of nested WITH clauses is changed and 
an inner CTE definition takes precedence over an outer. In version 2.4 and 
earlier, `WITH t AS (SELECT 1), t2 AS (WITH t AS (SELECT 2) SELECT * FROM t) 
SELECT * FROM t2` returns `1` while in version 3.0 it returns `2`. The previous 
behaviour can be restored by setting `spark.sql.legacy.ctePrecedence.enabled` 
to `true`.
 
+  - Since Spark 3.0, the `add_months` function does not adjust the resulting date to the last day of the month if the original date is the last day of the month. The resulting date is adjusted to the last day of the month only if it would otherwise be invalid. For example, `select add_months(DATE'2019-02-28', 1)` produces `2019-03-28`, but `select add_months(DATE'2019-01-31', 1)` produces `2019-02-28`.
 
 Review comment:
  > So previously, adding a month to 2019-02-28 resulted in 2019-03-31?
  
  Yes, it did:
   ```
   scala> spark.sql("select add_months(DATE'2019-02-28', 1)").show
   +--------------------------------+
   |add_months(DATE '2019-02-28', 1)|
   +--------------------------------+
   |                      2019-03-31|
   +--------------------------------+
   ```
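   
   The new clamping behaviour matches `java.time.LocalDate.plusMonths` from the Java 8 time API that this PR switches to; a minimal plain-Java sketch of the two cases from the migration note:
   ```java
   import java.time.LocalDate;
   
   public class AddMonthsExample {
       public static void main(String[] args) {
           // plusMonths keeps the day-of-month and clamps only when the
           // result would be invalid, e.g. Feb 31 -> Feb 28.
           System.out.println(LocalDate.parse("2019-02-28").plusMonths(1)); // 2019-03-28
           System.out.println(LocalDate.parse("2019-01-31").plusMonths(1)); // 2019-02-28
       }
   }
   ```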
   
