TJX2014 edited a comment on pull request #29043:
URL: https://github.com/apache/spark/pull/29043#issuecomment-657026639


   > Yeah DATETIME is definitely not the same thing as TIMESTAMP. You need to start with a Spark date-time type, logically.
   
   Hi, @srowen 
   Thank you for your response.
   The range of the Spark date-time type is [0001-01-01, 9999-12-31], which ignores hours, minutes, seconds, and so on. Is that what you mean?
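   To make the question concrete, here is a minimal sketch (the example values are made up and not part of this PR) of what I mean by dropping the time-of-day part when going from a Spark TIMESTAMP to a DATE:

   ```scala
   // Minimal sketch (example values are hypothetical): converting a Spark
   // TIMESTAMP to a DATE keeps only the calendar date within
   // [0001-01-01, 9999-12-31] and drops the hour/minute/second part.
   import org.apache.spark.sql.SparkSession
   import org.apache.spark.sql.functions.{to_date, to_timestamp}

   val spark = SparkSession.builder()
     .master("local[*]")
     .appName("date-vs-timestamp")
     .getOrCreate()
   import spark.implicits._

   val df = Seq("2020-07-10 15:32:10").toDF("ts_str")
     .select(to_timestamp($"ts_str").as("ts"))

   // to_date keeps only the date component of the timestamp
   df.select($"ts", to_date($"ts").as("d")).show(false)
   // +-------------------+----------+
   // |ts                 |d         |
   // +-------------------+----------+
   // |2020-07-10 15:32:10|2020-07-10|
   // +-------------------+----------+
   ```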

