attilapiros commented on a change in pull request #24181: [SPARK-27242][SQL]
Make formatting TIMESTAMP/DATE literals independent from the default time zone
URL: https://github.com/apache/spark/pull/24181#discussion_r268155733
##########
File path: docs/sql-migration-guide-upgrade.md
##########
@@ -96,13 +96,17 @@ displayTitle: Spark SQL Upgrading Guide
- The `weekofyear`, `weekday`, `dayofweek`, `date_trunc`,
`from_utc_timestamp`, `to_utc_timestamp`, and `unix_timestamp` functions use
the java.time API for calculating the week number of the year and the day
number of the week, as well as for conversions from/to TimestampType values
in the UTC time zone.
- The JDBC options `lowerBound` and `upperBound` are converted to
TimestampType/DateType values in the same way as casting strings to
TimestampType/DateType values. The conversion is based on the Proleptic
Gregorian calendar and the time zone defined by the SQL config
`spark.sql.session.timeZone`. In Spark version 2.4 and earlier, the
conversion is based on the hybrid calendar (Julian + Gregorian) and on the
default system time zone.
+
+ - Formatting of `TIMESTAMP` and `DATE` literals.
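
As an illustration of the first bullet in the hunk above, here is a minimal
Scala sketch of those date/time functions. The function names are the real
Spark SQL built-ins; the local session setup and the sample dates are only
assumptions made for the example.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: a local session with an explicit session time zone, so the
// java.time-based calculations do not depend on the JVM default time zone.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("datetime-functions-sketch")
  .config("spark.sql.session.timeZone", "UTC")
  .getOrCreate()

// weekofyear / dayofweek compute the week number of the year and the day
// number of the week; unix_timestamp converts to seconds since the epoch.
spark.sql(
  """SELECT
    |  weekofyear(DATE '2019-03-22')                   AS week_of_year,
    |  dayofweek(DATE '2019-03-22')                    AS day_of_week,
    |  unix_timestamp(TIMESTAMP '2019-03-22 01:02:03') AS epoch_seconds
    |""".stripMargin).show()
```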
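
For the JDBC bullet, a hypothetical partitioned read: the `url`, `dbtable`,
and `partitionColumn` values below are placeholders, not taken from the PR.
The point is that the `lowerBound`/`upperBound` strings are parsed like a
string-to-`TIMESTAMP` cast, i.e. with the Proleptic Gregorian calendar and
the session time zone.

```scala
// Hypothetical JDBC read, reusing the `spark` session from the sketch above.
// The bound strings are interpreted like CAST('...' AS TIMESTAMP): Proleptic
// Gregorian calendar plus spark.sql.session.timeZone, instead of the hybrid
// calendar and the default system time zone used by Spark 2.4 and earlier.
val orders = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://db-host:5432/sales")  // placeholder URL
  .option("dbtable", "orders")                            // placeholder table
  .option("partitionColumn", "created_at")                // a TIMESTAMP column
  .option("lowerBound", "2019-01-01 00:00:00")
  .option("upperBound", "2019-12-31 23:59:59")
  .option("numPartitions", "8")
  .load()
```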
Review comment:
Ok, thanks!
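
A rough way to observe the kind of behaviour the PR title describes, namely
that the textual rendering of timestamps should follow
`spark.sql.session.timeZone` rather than the JVM default, is to compare one
fixed instant under two session time zones. This is only an observational
sketch against a generic `spark` session, not the PR's own test case.

```scala
// One fixed instant (0 seconds since the epoch), cast to STRING under two
// session time zones: the rendered text should follow the session time zone,
// not the JVM default (-Duser.timezone).
Seq("UTC", "America/Los_Angeles").foreach { tz =>
  spark.conf.set("spark.sql.session.timeZone", tz)
  spark.sql("SELECT CAST(CAST(0 AS TIMESTAMP) AS STRING) AS rendered")
    .show(false)
}
```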