tomvanbussel commented on a change in pull request #30056:
URL: https://github.com/apache/spark/pull/30056#discussion_r507639347
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -2640,6 +2640,20 @@ object SQLConf {
.checkValues(LegacyBehaviorPolicy.values.map(_.toString))
.createWithDefault(LegacyBehaviorPolicy.EXCEPTION.toString)
+  val LEGACY_PARQUET_INT96_REBASE_MODE_IN_WRITE =
+    buildConf("spark.sql.legacy.parquet.int96RebaseModeInWrite")
+      .internal()
+      .doc("When LEGACY, which is the default, Spark will rebase INT96 timestamps from " +
+        "Proleptic Gregorian calendar to the legacy hybrid (Julian + Gregorian) calendar when " +
+        "writing Parquet files. When CORRECTED, Spark will not do rebase and write the timestamps " +
+        "as it is. When EXCEPTION, Spark will fail the writing if it sees ancient timestamps " +
+        "that are ambiguous between the two calendars.")
+      .version("3.1.0")
+      .stringConf
+      .transform(_.toUpperCase(Locale.ROOT))
+      .checkValues(LegacyBehaviorPolicy.values.map(_.toString))
+      .createWithDefault(LegacyBehaviorPolicy.LEGACY.toString)
Review comment:
Could the default be made `LegacyBehaviorPolicy.EXCEPTION` instead?
Could also do this in a follow-up PR if this is controversial.
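
For illustration, a minimal sketch of how a user would override whichever default is chosen; it assumes a running `SparkSession` named `spark` and a DataFrame `df` with a timestamp column, and uses only the config key introduced in the diff above:

    // Hypothetical usage sketch: explicitly request the non-rebasing behavior for this session,
    // independent of the session default for int96RebaseModeInWrite.
    spark.conf.set("spark.sql.legacy.parquet.int96RebaseModeInWrite", "CORRECTED")
    df.write.parquet("/tmp/int96_corrected")

Per the doc string above, with EXCEPTION as the default a write containing ancient timestamps (those that are ambiguous between the two calendars) would fail until the user explicitly picks LEGACY or CORRECTED, which is the safer default for avoiding silent data changes.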