cloud-fan commented on a change in pull request #32949:
URL: https://github.com/apache/spark/pull/32949#discussion_r667060682



##########
File path: sql/core/src/test/resources/sql-tests/results/ansi/interval.sql.out
##########
@@ -13,9 +13,31 @@ struct<((TIMESTAMP '2019-10-15 10:11:12.001002' - DATE '2019-10-15') * 3):interv
 -- !query
 select interval 4 month 2 weeks 3 microseconds * 1.5
 -- !query schema
-struct<multiply_interval(INTERVAL '4 months 14 days 0.000003 seconds', 1.5):interval>
+struct<>
 -- !query output
-6 months 21 days 0.000005 seconds
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot mix year-month and day-time fields: 4 month 2 weeks 3 microseconds(line 1, pos 16)
+
+== SQL ==
+select interval 4 month 2 weeks 3 microseconds * 1.5
+----------------^^^
+
+
+-- !query
+select interval 2 years 4 months
+-- !query schema
+struct<INTERVAL '2-4' YEAR TO MONTH:interval year to month>
+-- !query output
+2-4
+
+
+-- !query
+select interval 2 weeks 3 microseconds * 1.5
+-- !query schema
+struct<(INTERVAL '14 00:00:00.000003' DAY TO SECOND * 1.5):interval day to second>
+-- !query output
+21 00:00:00.000005000

Review comment:
       hmm, so we always print 9 digits for seconds even though our precision is only microseconds?
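
       To make the question concrete, here is a small Scala sketch (not part of this PR) that re-runs the day-time query from the golden file on a local session; it assumes a Spark 3.2+ build with the ANSI interval types, and the object and app names are just placeholders:

       ```scala
       // A minimal sketch, not taken from the PR under review: it assumes a local
       // Spark 3.2+ build with ANSI interval types and simply re-runs the
       // day-time query from the golden file so the rendered fractional seconds
       // can be inspected. The object/app names are placeholders.
       import org.apache.spark.sql.SparkSession

       object IntervalSecondsPrecisionCheck {
         def main(args: Array[String]): Unit = {
           val spark = SparkSession.builder()
             .master("local[1]")
             .appName("interval-seconds-precision-check")
             .getOrCreate()

           // The golden file above records the result of this query as
           // "21 00:00:00.000005000" -- nine fractional digits, although the
           // day-time interval only carries microsecond precision.
           spark.sql("select interval 2 weeks 3 microseconds * 1.5")
             .show(truncate = false)

           spark.stop()
         }
       }
       ```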




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


