[ 
https://issues.apache.org/jira/browse/SPARK-46440?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Max Gekk resolved SPARK-46440.
------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 44388
[https://github.com/apache/spark/pull/44388]

> Set the rebase configs to the CORRECTED mode by default
> -------------------------------------------------------
>
>                 Key: SPARK-46440
>                 URL: https://issues.apache.org/jira/browse/SPARK-46440
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Max Gekk
>            Assignee: Max Gekk
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> Set all rebase related SQL configs to the `CORRECTED` mode by default. Here 
> are the affected configs:
> - spark.sql.parquet.int96RebaseModeInWrite
> - spark.sql.parquet.datetimeRebaseModeInWrite
> - spark.sql.parquet.int96RebaseModeInRead
> - spark.sql.parquet.datetimeRebaseModeInRead
> - spark.sql.avro.datetimeRebaseModeInWrite
> - spark.sql.avro.datetimeRebaseModeInRead
> The configs were previously set to the `EXCEPTION` mode so that users could 
> choose the proper mode for compatibility with old Spark versions (<= 2.4.5), 
> which cannot detect the rebase mode from the metadata in Parquet and Avro 
> files. Since those versions are no longer in broad use, Spark will, starting 
> from version 4.0.0, write/read ancient datetimes without rebasing and without 
> raising exceptions. This should be more convenient for users.
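For users who still exchange files with Spark <= 2.4.5 and need the old rebasing behavior, the defaults can be pinned back explicitly. A minimal sketch, assuming the standard `spark-defaults.conf` mechanism (the `LEGACY` value shown here is one of the three documented modes, alongside `CORRECTED` and `EXCEPTION`):

```
spark.sql.parquet.int96RebaseModeInWrite      LEGACY
spark.sql.parquet.datetimeRebaseModeInWrite   LEGACY
spark.sql.parquet.int96RebaseModeInRead       LEGACY
spark.sql.parquet.datetimeRebaseModeInRead    LEGACY
spark.sql.avro.datetimeRebaseModeInWrite      LEGACY
spark.sql.avro.datetimeRebaseModeInRead       LEGACY
```

The same keys can also be set per-session via `spark.conf.set(...)` at runtime.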



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
