MaxGekk commented on a change in pull request #28137: [SPARK-31361][SQL] Rebase datetime in parquet/avro according to file written Spark version
URL: https://github.com/apache/spark/pull/28137#discussion_r404299889
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceUtils.scala
 ##########
 @@ -64,4 +66,10 @@ object DataSourceUtils {
 
   private[sql] def isDataFile(fileName: String) =
     !(fileName.startsWith("_") || fileName.startsWith("."))
+
+  def needRebaseDateTime(lookupFileMeta: String => String): Option[Boolean] = {
+    // If there is no version, we return None and let the caller side decide.
+    // Files written by Spark 3.0 and later follow the new calendar and don't need to rebase.
+    Option(lookupFileMeta(SPARK_VERSION_METADATA_KEY)).map(_ < "3.0")
 
 Review comment:
   > Files written by Spark 3.0 and later follow the new calendar and don't need to rebase.
   
   This is an unexpected solution to me, and it doesn't cover some use cases from my point of view. I believe the Spark version stored in parquet/avro file metadata is not enough to make a decision about rebasing.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 