MaxGekk opened a new pull request #31562:
URL: https://github.com/apache/spark/pull/31562


   ### What changes were proposed in this pull request?
   Mention the DS options introduced by 
https://github.com/apache/spark/pull/31529 and by 
https://github.com/apache/spark/pull/31489 in `SparkUpgradeException`.
   
   ### Why are the changes needed?
   To improve the user experience with Spark SQL. Before the changes, the error 
message recommends setting SQL configs, but those configs cannot help in some 
situations (see the PRs for more details).
   
   ### Does this PR introduce _any_ user-facing change?
   Yes. After the changes, the error message is:
   
   _org.apache.spark.SparkUpgradeException: You may get a different result due 
to the upgrading of Spark 3.0: reading dates before 1582-10-15 or timestamps 
before 1900-01-01T00:00:00Z from Parquet files can be ambiguous, as the files 
may be written by Spark 2.x or legacy versions of Hive, which uses a legacy 
hybrid calendar that is different from Spark 3.0+'s Proleptic Gregorian 
calendar. See more details in SPARK-31404. You can set the SQL config 
'spark.sql.legacy.parquet.datetimeRebaseModeInRead' or the datasource option 
'datetimeRebaseMode' to 'LEGACY' to rebase the datetime values w.r.t. the 
calendar difference during reading. To read the datetime values as it is, set 
the SQL config 'spark.sql.legacy.parquet.datetimeRebaseModeInRead' or the 
datasource option 'datetimeRebaseMode' to 'CORRECTED'._
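   As a hedged sketch of the two remedies the message above names (assuming an existing `SparkSession` in scope as `spark`; the path `/data/old_parquet` is a hypothetical example):

   ```scala
   // Remedy 1: set the datasource option per read, affecting only this scan.
   val corrected = spark.read
     .option("datetimeRebaseMode", "CORRECTED") // read the datetime values as-is
     .parquet("/data/old_parquet")

   val rebased = spark.read
     .option("datetimeRebaseMode", "LEGACY")    // rebase w.r.t. the calendar difference
     .parquet("/data/old_parquet")

   // Remedy 2: set the SQL config instead, affecting the whole session.
   spark.conf.set("spark.sql.legacy.parquet.datetimeRebaseModeInRead", "CORRECTED")
   ```

   The datasource option takes precedence for the read it is attached to, which is why the new error message mentions it alongside the session-wide config.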
   
   
   ### How was this patch tested?
   1. By checking coding style: `./dev/scalastyle`
   2. By running the related test suite:
   ```
   $ build/sbt -Phive-2.3 -Phive-thriftserver "test:testOnly *ParquetRebaseDatetimeV1Suite"
   ```


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


