MaxGekk commented on a change in pull request #24342: [SPARK-27438][SQL] Parse 
strings with timestamps by to_timestamp() in microsecond precision
URL: https://github.com/apache/spark/pull/24342#discussion_r276195060
 
 

 ##########
 File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
 ##########
 @@ -203,7 +203,11 @@ object Cast {
       > SELECT _FUNC_('10' as int);
        10
   """)
-case class Cast(child: Expression, dataType: DataType, timeZoneId: 
Option[String] = None)
+case class Cast(
+    child: Expression,
+    dataType: DataType,
+    timeZoneId: Option[String] = None,
+    timestampScaleFactor: Long = MICROS_PER_SECOND)
 
 Review comment:
   `ParseToTimestamp` extends `RuntimeReplaceable`, and the optimizer replaces it with `Cast(UnixTimestamp)`. Not sure how important that is.
   
   Frankly speaking, I would add a new expression (and register a function) that takes just a string and a pattern, and parses it to `TimestampType` with microsecond precision. We have `DateTimeUtils.stringToTimestamp`, but it doesn't accept a pattern.
   
   Surprisingly, parsing a string with a specific pattern is already possible via `from_csv`, though that is not a straightforward way of doing it.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]

