07ARB commented on a change in pull request #26477: [SPARK-29776][SQL] rpad and
lpad should return NULL when padstring parameter is empty
URL: https://github.com/apache/spark/pull/26477#discussion_r361548262
##########
File path: docs/sql-migration-guide.md
##########
@@ -259,9 +259,10 @@ license: |
- Since Spark 3.0, day-time interval strings are converted to intervals with
respect to the `from` and `to` bounds. If an input string does not match the
pattern defined by the specified bounds, a `ParseException` is thrown.
For example, `interval '2 10:20' hour to minute` raises the exception because
the expected format is `[+|-]h[h]:[m]m`. In Spark version 2.4, the `from` bound
was not taken into account, and the `to` bound was used to truncate the
resulting interval. For instance, the day-time interval string from the example
above is converted to `interval 10 hours 20 minutes`. To restore the behavior
before Spark 3.0, you can set `spark.sql.legacy.fromDayTimeString.enabled` to
`true` (see the first sketch below).
- Since Spark 3.0, the `date_add` and `date_sub` functions accept only int,
smallint, and tinyint as the 2nd argument; fractional and string types are no
longer valid. For example, `date_add(cast('1964-05-23' as date), '12.34')`
causes an `AnalysisException`. In Spark version 2.4 and earlier, if the 2nd
argument is a fractional or string value, it is coerced to an int value, and
the result is a date value of `1964-06-04` (see the second sketch below).
-
+
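
Not part of the patch, but a minimal sketch of the interval parsing change
described in the first bullet of the hunk above, assuming a Spark 3.0 SQL
session:

```sql
-- Sketch only: Spark 3.0 requires the string to match the pattern implied by
-- the bounds (here `[+|-]h[h]:[m]m` for HOUR TO MINUTE).
SELECT INTERVAL '10:20' HOUR TO MINUTE;   -- OK: 10 hours 20 minutes
SELECT INTERVAL '2 10:20' HOUR TO MINUTE; -- ParseException in Spark 3.0;
                                          -- Spark 2.4 ignored the `from` bound
                                          -- and returned 10 hours 20 minutes
-- Legacy switch named in the guide to restore the 2.4 behavior:
SET spark.sql.legacy.fromDayTimeString.enabled=true;
```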
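
Likewise, a sketch for the `date_add`/`date_sub` bullet; the input and the
`1964-06-04` result come straight from the guide text:

```sql
-- Sketch only: since Spark 3.0 the 2nd argument must be int, smallint, or tinyint.
SELECT date_add(CAST('1964-05-23' AS DATE), 12);      -- 1964-06-04 in both versions
SELECT date_add(CAST('1964-05-23' AS DATE), '12.34'); -- AnalysisException in Spark 3.0;
                                                      -- Spark 2.4 coerced '12.34' to 12
                                                      -- and returned 1964-06-04
```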
Review comment:
ok