MaxGekk commented on a change in pull request #27438: [MINOR][SQL][DOCS][2.4] Fix the descriptions of `to_timestamp` and `ParseToTimestamp`
URL: https://github.com/apache/spark/pull/27438#discussion_r373962994
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/functions.scala
 ##########
 @@ -2949,10 +2949,10 @@ object functions {
   def unix_timestamp(s: Column, p: String): Column = withExpr { UnixTimestamp(s.expr, Literal(p)) }
 
   /**
 -   * Converts to a timestamp by casting rules to `TimestampType`.
 +   * Converts to a timestamp by casting rules to `TimestampType` in the seconds precision.
     *
     * @param s A date, timestamp or string. If a string, the data must be in a format that can be
 -   *          cast to a timestamp, such as `yyyy-MM-dd` or `yyyy-MM-dd HH:mm:ss.SSSS`
 +   *          cast to a timestamp, such as `yyyy-MM-dd` or `yyyy-MM-dd HH:mm:ss`
 
 Review comment:
   Since SimpleDateFormat/FastDateFormat cannot parse at microsecond precision, I would propose switching `to_timestamp` to the parser introduced by https://github.com/apache/spark/pull/26507 and porting https://github.com/apache/spark/pull/24420 back to 2.4. At the moment, the behavior of `to_timestamp` doesn't seem consistent. /cc @cloud-fan WDYT?
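   
   As a minimal standalone illustration of the precision gap (plain JDK APIs, not Spark code; the sample timestamp value is made up), SimpleDateFormat treats `S` as a millisecond count, while `java.time`'s DateTimeFormatter treats it as a fraction of a second:
   
   ```scala
   import java.text.SimpleDateFormat
   import java.time.LocalDateTime
   import java.time.format.DateTimeFormatter
   
   // Legacy parser: 'S' means milliseconds, so ".123456" is read as 123456 ms
   // and silently shifts the result by roughly two minutes.
   val legacy = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSSSSS")
   println(legacy.parse("2020-02-04 12:00:00.123456"))
   // -> something like Tue Feb 04 12:02:03 ... 2020
   
   // java.time parser: 'S' means fraction-of-second, so microseconds survive.
   val modern = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSS")
   println(LocalDateTime.parse("2020-02-04 12:00:00.123456", modern))
   // -> 2020-02-04T12:00:00.123456
   ```
   
   This is only a sketch of why the FastDateFormat-based path cannot round-trip sub-second values beyond milliseconds; the actual fix would be the parser from the PRs linked above.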
