sarutak commented on a change in pull request #32949:
URL: https://github.com/apache/spark/pull/32949#discussion_r664963511



##########
File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
##########
@@ -2526,10 +2526,10 @@ object DatePart {
        224
       > SELECT _FUNC_('SECONDS', timestamp'2019-10-01 00:00:01.000001');
        1.000001
-      > SELECT _FUNC_('days', interval 1 year 10 months 5 days);
+      > SELECT _FUNC_('days', interval 5 days 3 hours 7 minutes);
        5
-      > SELECT _FUNC_('seconds', interval 5 hours 30 seconds 1 milliseconds 1 microseconds);

Review comment:
       Ah, I guess you mean that `milliseconds` and `microseconds` aren't ANSI-supported fields, but that it's an acceptable extension to mix valid ANSI day-time fields with non-ANSI ones (`WEEK`, `MILLISECOND`, and `MICROSECOND`), right?
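
For reference, here is a minimal Python sketch (not Spark code) that approximates the SECONDS-field semantics of the removed example, i.e. extracting the whole seconds within the minute plus the sub-second fraction. The helper name `extract_seconds` is illustrative, not part of any Spark API:

```python
from datetime import timedelta
from decimal import Decimal

def extract_seconds(td: timedelta) -> Decimal:
    # SECONDS field: whole seconds within the current minute,
    # plus the fractional (millisecond/microsecond) part.
    whole = td.seconds % 60
    frac = Decimal(td.microseconds) / Decimal(1_000_000)
    return Decimal(whole) + frac

# interval 5 hours 30 seconds 1 milliseconds 1 microseconds
iv = timedelta(hours=5, seconds=30, milliseconds=1, microseconds=1)
print(extract_seconds(iv))  # 30.001001
```

This matches the `1.000001`-style fractional output shown for the timestamp example above: the hours contribute nothing to the SECONDS field, only the seconds-within-minute and the sub-second part do.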




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


