GuoPhilipse opened a new pull request #28568:
URL: https://github.com/apache/spark/pull/28568


   **What changes were proposed in this pull request?**
   Hive interprets a long value as milliseconds when casting it to a timestamp, while Spark interprets the same value as seconds. Because of this mismatch we have been getting corrupted data when migrating Hive SQL to Spark SQL. This PR adds a compatibility flag so the Hive behavior can be opted into.
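   For illustration, a minimal Spark SQL sketch of the mismatch; the literal epoch value below is an example chosen for this description, not taken from the patch:

   ```sql
   -- Hive: a long is read as milliseconds since the epoch,
   -- so 1586318188000 maps to a timestamp in April 2020.
   SELECT CAST(1586318188000 AS TIMESTAMP);

   -- Spark (current behavior): the same cast reads the value as seconds,
   -- so 1586318188000 lands tens of thousands of years in the future,
   -- while 1586318188 maps to April 2020.
   SELECT CAST(1586318188 AS TIMESTAMP);
   ```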
   
   **Why are the changes needed?**
   We have many SQL jobs running in production, so we need a compatibility flag to migrate them smoothly without changing the default behavior in Spark.
   
   **Does this PR introduce any user-facing change?**
   Yes, but only for users who opt in: to get the Hive-compatible behavior, the new parameter must be set explicitly. Users who do nothing see no change in behavior.
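   A sketch of how opting in might look; the property name below is a placeholder for illustration only, the actual config key is defined in the patch:

   ```sql
   -- Placeholder key for illustration; see the patch for the real config name.
   SET spark.sql.legacy.longToTimestampInMillis=true;

   -- With the flag enabled the cast follows the Hive (milliseconds) interpretation;
   -- with it disabled (the default) Spark keeps its current seconds interpretation.
   SELECT CAST(1586318188000 AS TIMESTAMP);
   ```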
   
   **How was this patch tested?**
   Unit tests added.

