MaxGekk opened a new pull request #28729: URL: https://github.com/apache/spark/pull/28729
### What changes were proposed in this pull request?
Set `spark.sql.datetime.java8API.enabled` to `true` in:
1. `SparkSQLEnv.init()` of the Thrift server, and
2. `SparkOperation.withLocalProperties()`.

### Why are the changes needed?
1. Date and timestamp string literals are parsed using the Java 8 time API and Spark's session time zone. Before the changes, date/timestamp values were collected as the legacy types `java.sql.Date`/`java.sql.Timestamp`, whose values did not respect the config `spark.sql.session.timeZone`. To get a consistent view, users had to keep the JVM time zone and Spark's session time zone in sync.
2. After the changes, formatting of date values does not depend on the JVM time zone.
3. By returning dates/timestamps as Java 8 types, we can avoid rebasing them from the Proleptic Gregorian calendar to the hybrid calendar (Julian + Gregorian), and the issues related to calendar switching.
4. Negative years (BCE) are handled properly.
5. Conversion of date/timestamp strings to and from internal Catalyst types becomes consistent in both directions.

### Does this PR introduce any user-facing change?
Yes.

Before:
```sql
spark-sql> select make_date(-44, 3, 15);
0045-03-15
```

After:
```sql
spark-sql> select make_date(-44, 3, 15);
-0044-03-15
```

### How was this patch tested?
Manually via `bin/spark-sql`.
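The effect of the config this PR enables can be sketched as below. This is a minimal sketch of user-facing behavior, not the PR's actual diff (which wires the config into `SparkSQLEnv.init()` and `SparkOperation.withLocalProperties()` inside the Thrift server):

```scala
// Sketch only: turn on the Java 8 time API for a local session, so collected
// DATE/TIMESTAMP values are java.time.LocalDate / java.time.Instant instead of
// the legacy java.sql.Date / java.sql.Timestamp.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.datetime.java8API.enabled", "true") // the config this PR enables
  .getOrCreate()

// java.time.LocalDate uses the Proleptic Gregorian calendar and supports
// negative (BCE) years, so no rebasing to the hybrid calendar is needed.
val d = spark.sql("select make_date(-44, 3, 15)").head().get(0)
println(d.getClass.getName) // expected: java.time.LocalDate when the config is on
```

With the config off (the old default for the Thrift server), the same query would collect a `java.sql.Date`, whose string form depends on the JVM time zone and the hybrid Julian + Gregorian calendar.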