TJX2014 opened a new pull request #28534: URL: https://github.com/apache/spark/pull/28534
### What changes were proposed in this pull request?

Interpret a long value as milliseconds when converting it to a timestamp, so it is multiplied by 1,000 rather than 1,000,000 to reach microseconds. A unit test is added.

### Why are the changes needed?

In Spark SQL, converting 1586318188000 to a timestamp yields 52238-06-04 13:06:40.0, while the correct result is 2020-04-08 11:56:28. Spark treats the input as seconds and therefore multiplies by 1,000,000 to convert it to microseconds, while Hive treats it as milliseconds and multiplies by 1,000 only (see the sketch below).

### Does this PR introduce _any_ user-facing change?

Yes. A long value will now be interpreted as milliseconds rather than seconds when converted to a timestamp.

### How was this patch tested?

Unit test.
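To make the behavior difference concrete, here is a minimal, hypothetical sketch (the object name and session setup are illustrative and not taken from the PR's test code) that runs the cast described above:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical demo, not part of this PR: shows the cast whose semantics
// the change affects.
object TimestampCastDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("TimestampCastDemo")
      .getOrCreate()

    // Before this change, Spark interprets the long as seconds since the
    // epoch and multiplies by 1,000,000 to reach microseconds, producing
    // the overflowed result 52238-06-04 13:06:40.0.
    // Hive multiplies by 1,000 (treating the value as milliseconds),
    // yielding 2020-04-08 11:56:28.
    spark.sql("SELECT CAST(1586318188000 AS TIMESTAMP)").show(false)

    spark.stop()
  }
}
```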