MaxGekk commented on a change in pull request #31564:
URL: https://github.com/apache/spark/pull/31564#discussion_r576624487



##########
File path: docs/sql-data-sources-parquet.md
##########
@@ -329,4 +365,54 @@ Configuration of Parquet can be done using the `setConf` method on `SparkSession
   </td>
   <td>1.6.0</td>
 </tr>
+<tr>
+  <td>spark.sql.legacy.parquet.datetimeRebaseModeInRead</td>
+  <td><code>EXCEPTION</code></td>
+  <td>The rebasing mode for the values of the <code>DATE</code>, <code>TIMESTAMP_MILLIS</code>, <code>TIMESTAMP_MICROS</code> logical types from the Julian to the Proleptic Gregorian calendar:<br>
+    <ul>
+      <li><code>EXCEPTION</code>: Spark will fail reading if it sees ancient dates/timestamps that are ambiguous between the two calendars.</li>
+      <li><code>CORRECTED</code>: Spark will not rebase and will read the dates/timestamps as they are.</li>
+      <li><code>LEGACY</code>: Spark will rebase dates/timestamps from the legacy hybrid (Julian + Gregorian) calendar to the Proleptic Gregorian calendar when reading Parquet files.</li>
+    </ul>
+    This config is only effective if the writer info (like Spark, Hive) of the Parquet files is unknown.
+  </td>
+  <td>3.0.0</td>
+</tr>
+<tr>
+  <td>spark.sql.legacy.parquet.datetimeRebaseModeInWrite</td>

Review comment:
       Regarding the mention of the rebasing SQL configs in the Spark SQL guide, I see at least three options:
   1. Remove `.internal()`, as Hyukjin proposed.
   2. Do not document them at all; document only the DS options.
   3. The approach of the current PR: document them and leave them as `internal()`. I do believe we should document these configs since we mention them in Spark upgrade exceptions.
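For context, the SQL config documented in the diff above is session-scoped, so a user hitting the upgrade exception can switch modes without restarting Spark. A minimal sketch (the value `CORRECTED` is one of the three modes listed in the table; `LEGACY` or `EXCEPTION` would work the same way):

```sql
-- Set the read-side rebase mode for the current session only.
-- Valid values, per the table above: EXCEPTION, CORRECTED, LEGACY.
SET spark.sql.legacy.parquet.datetimeRebaseModeInRead = CORRECTED;
```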




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


