HyukjinKwon commented on a change in pull request #25224: 
[SPARK-28321][DOCS][FOLLOW-UP] Update migration guide by 0-args Java UDF's 
internal behaviour change
URL: https://github.com/apache/spark/pull/25224#discussion_r305671844
 
 

 ##########
 File path: docs/sql-migration-guide-upgrade.md
 ##########
 @@ -151,7 +151,9 @@ license: |
 
   - Since Spark 3.0, substitution order of nested WITH clauses is changed and 
an inner CTE definition takes precedence over an outer. In version 2.4 and 
earlier, `WITH t AS (SELECT 1), t2 AS (WITH t AS (SELECT 2) SELECT * FROM t) 
SELECT * FROM t2` returns `1` while in version 3.0 it returns `2`. The previous 
behaviour can be restored by setting `spark.sql.legacy.ctePrecedence.enabled` 
to `true`.
 
-  - Since Spark 3.0, the `add_months` function does not adjust the resulting 
date to a last day of month if the original date is a last day of months. For 
example, `select add_months(DATE'2019-02-28', 1)` results `2019-03-28`. In 
Spark version 2.4 and earlier, the resulting date is adjusted when the original 
date is a last day of months. For example, adding a month to `2019-02-28` 
resultes in `2019-03-31`.
+  - Since Spark 3.0, the `add_months` function does not adjust the resulting 
date to a last day of month if the original date is a last day of months. For 
example, `select add_months(DATE'2019-02-28', 1)` results `2019-03-28`. In 
Spark version 2.4 and earlier, the resulting date is adjusted when the original 
date is a last day of months. For example, adding a month to `2019-02-28` 
results in `2019-03-31`.
 
 Review comment:
   Just a typo fix from `resultes` -> `results`.
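As context for the first quoted paragraph: the CTE change means an inner `WITH` definition now shadows an outer one of the same name. The resolution-order difference can be modeled as nested name scopes. This is a toy Python sketch of the idea, not Spark's implementation; the function names are illustrative only.

```python
# Toy model of CTE name resolution as nested scopes (illustrative only,
# not Spark's actual resolution code).

def resolve_spark3(name, scopes):
    # Spark 3.0 semantics: search the innermost scope first,
    # so an inner CTE definition shadows an outer one.
    for scope in reversed(scopes):
        if name in scope:
            return scope[name]
    raise KeyError(name)

def resolve_legacy(name, scopes):
    # Spark 2.4 semantics (spark.sql.legacy.ctePrecedence.enabled=true):
    # the outer definition takes precedence.
    for scope in scopes:
        if name in scope:
            return scope[name]
    raise KeyError(name)

outer = {"t": 1}   # WITH t AS (SELECT 1)
inner = {"t": 2}   # ... t2 AS (WITH t AS (SELECT 2) ...)

print(resolve_spark3("t", [outer, inner]))   # 2 (inner wins, as in Spark 3.0)
print(resolve_legacy("t", [outer, inner]))   # 1 (outer wins, as in Spark 2.4)
```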
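The `add_months` paragraph being fixed describes a semantic change: Spark 3.0 no longer snaps the result to the last day of the month when the input date is a month-end. A toy Python sketch of the two behaviors, using `datetime` and `calendar` (the function and parameter names here are illustrative, not Spark's API):

```python
from datetime import date
import calendar

def add_months(d, n, adjust_to_month_end=False):
    """Toy model of add_months; adjust_to_month_end=True mimics Spark 2.4."""
    month_index = d.month - 1 + n
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    last = calendar.monthrange(year, month)[1]  # last day of target month
    day = min(d.day, last)
    # Pre-3.0 behavior: if the input was the last day of its month,
    # snap the result to the last day of the target month.
    if adjust_to_month_end and d.day == calendar.monthrange(d.year, d.month)[1]:
        day = last
    return date(year, month, day)

# Spark 3.0 semantics: no month-end adjustment
print(add_months(date(2019, 2, 28), 1))                             # 2019-03-28
# Spark 2.4 semantics: month-end snapping
print(add_months(date(2019, 2, 28), 1, adjust_to_month_end=True))   # 2019-03-31
```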

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
