MaxGekk commented on issue #24311: [SPARK-27401][SQL] Refactoring conversion of Timestamp to/from java.sql.Timestamp
URL: https://github.com/apache/spark/pull/24311#issuecomment-480565837
 
 
   > it because it seems not to be inside either Apache Spark or Apache ORC.
   > - OrcHadoopFsRelationSuite passed.
   > - HiveOrcHadoopFsRelationSuite failed.
   
   Yeah, it seems there is a calendar incompatibility issue somewhere inside Hive + ORC which could cause the difference `![8,0647-07-01]               [8,0647-06-28]` for dates before 1582 (I guess). I just thought we had switched to the Proleptic Gregorian calendar everywhere in Spark (https://issues.apache.org/jira/browse/SPARK-26651).
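   
   To make that shift concrete, here is a minimal, hypothetical sketch (not code from this PR) of how one instant gets two different labels: `java.time` uses the Proleptic Gregorian calendar, while `java.util.Date` + `SimpleDateFormat` fall back to the Julian calendar before the 1582-10-15 cutover, which amounts to a 3-day difference in the 7th century:
   
   ```scala
   import java.text.SimpleDateFormat
   import java.time.{LocalDate, ZoneOffset}
   import java.util.TimeZone
   
   object CalendarShiftSketch {
     def main(args: Array[String]): Unit = {
       // One instant, two calendar labels.
       // java.time uses the Proleptic Gregorian calendar: 0647-07-01.
       val proleptic = LocalDate.of(647, 7, 1)
       val millis = proleptic.atStartOfDay(ZoneOffset.UTC).toInstant.toEpochMilli
   
       // java.util.Date + SimpleDateFormat use the hybrid Julian/Gregorian
       // calendar (Julian before the 1582-10-15 cutover), so the same millis
       // print three days earlier in the 7th century.
       val hybridFormat = new SimpleDateFormat("yyyy-MM-dd")
       hybridFormat.setTimeZone(TimeZone.getTimeZone("UTC"))
   
       println(proleptic)                                        // 0647-07-01
       println(hybridFormat.format(new java.util.Date(millis)))  // 0647-06-28
     }
   }
   ```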
   
   In any case, I think my date (not timestamp) related changes are potentially more expensive compared to the current implementation because of the conversion of `java.sql.Date` -> `java.time.LocalDate`. That conversion extracts components like `year` and `month` from `java.sql.Date`, which is not cheap, but the current implementation does time zone shifting, which is not free either. Either way, I am going to leave the Java <-> Catalyst date conversions as is for now.
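   
   For illustration, a rough sketch of the two conversion shapes I have in mind; the helper names and the simplified time-zone-shift variant are hypothetical, not Spark's actual `DateTimeUtils` code:
   
   ```scala
   import java.util.TimeZone
   import java.util.concurrent.TimeUnit
   
   object DateConversionSketch {
     private val MillisPerDay = TimeUnit.DAYS.toMillis(1)
   
     // Simplified shape of the millis-based path: shift the epoch millis by the
     // time zone offset and divide by the length of a day.
     def daysViaTimeZoneShift(date: java.sql.Date, tz: TimeZone): Int = {
       val localMillis = date.getTime + tz.getOffset(date.getTime)
       Math.floorDiv(localMillis, MillisPerDay).toInt
     }
   
     // The java.time-based path discussed above: extract year/month/day fields
     // via java.sql.Date.toLocalDate and recompute the Proleptic Gregorian epoch day.
     def daysViaLocalDate(date: java.sql.Date): Int =
       date.toLocalDate.toEpochDay.toInt
   
     def main(args: Array[String]): Unit = {
       val date = java.sql.Date.valueOf("2019-04-05")
       println(daysViaTimeZoneShift(date, TimeZone.getDefault)) // 17991
       println(daysViaLocalDate(date))                          // 17991
     }
   }
   ```
   
   For modern dates both variants return the same day number; before the 1582 cutover they can disagree, which is exactly the kind of difference discussed above.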
