cloud-fan commented on pull request #32001:
URL: https://github.com/apache/spark/pull/32001#issuecomment-810301325


   In the SQL standard, the interval type has a precision, so `CAST(col AS INTERVAL DAY TO MINUTE)` is not ambiguous: the value is treated as a number of minutes.
   
   However, Spark will not add precision to its interval types in the near future, so `$"col".cast(DayTimeIntervalType)` is ambiguous: it's unclear how to interpret the input value. Seconds, microseconds, or even minutes?
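   To make the ambiguity concrete, here is a minimal standalone Scala sketch using `java.time.Duration` (not Spark's internal interval representation; the value `90` is a made-up example):

   ```scala
   import java.time.Duration

   // A raw numeric column value; the unit is not encoded in the number itself.
   val raw = 90L

   // With a declared precision such as INTERVAL DAY TO MINUTE, the unit is
   // pinned to the least significant field, here minutes:
   val asMinutes = Duration.ofMinutes(raw)      // PT1H30M

   // Without a precision, the same number admits several readings:
   val asSeconds = Duration.ofSeconds(raw)      // PT1M30S
   val asMicros  = Duration.ofNanos(raw * 1000) // PT0.00009S

   println(Seq(asMinutes, asSeconds, asMicros).mkString(", "))
   ```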
   
   The same problem applies to casting numbers to timestamp. As a result, we forbid casting numbers to timestamp under ANSI mode and provide explicit functions to do so.
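   For reference, a hedged sketch of those explicit alternatives, assuming the `timestamp_seconds`/`timestamp_millis`/`timestamp_micros` SQL functions added in Spark 3.1:

   ```scala
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().master("local[*]").getOrCreate()

   // Each function names the unit explicitly, so there is nothing to guess.
   // 1617235200 seconds since the epoch is 2021-04-01 00:00:00 UTC.
   spark.sql("SELECT timestamp_seconds(1617235200)").show()
   spark.sql("SELECT timestamp_millis(1617235200000)").show()
   spark.sql("SELECT timestamp_micros(1617235200000000)").show()
   ```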
   
   How useful is it to support casting from numbers to intervals? We already have the `make_interval` function, and we can create intervals from string values. Supporting this cast doesn't seem to add much value.
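   A quick sketch of those existing alternatives (`make_interval` takes years, months, weeks, days, hours, mins, secs and has been in Spark since 3.0; the interval literal below is the string form Spark already parses):

   ```scala
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().master("local[*]").getOrCreate()

   // Every field is explicit, so no unit has to be guessed:
   spark.sql("SELECT make_interval(0, 0, 0, 1, 2, 30, 0)").show(truncate = false)
   // -> 1 days 2 hours 30 minutes

   // Intervals can also be created from strings:
   spark.sql("SELECT INTERVAL '1 day 2 hours 30 minutes'").show(truncate = false)
   ```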

