srowen commented on a change in pull request #25981: [SPARK-28420][SQL] Support the `INTERVAL` type in `date_part()`
URL: https://github.com/apache/spark/pull/25981#discussion_r332174457
 
 

 ##########
 File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
 ##########
 @@ -2067,6 +2082,10 @@ object DatePart {
        224
       > SELECT _FUNC_('SECONDS', timestamp'2019-10-01 00:00:01.000001');
        1.000001
+      > SELECT _FUNC_('days', interval 1 year 10 months 5 days);
 
 Review comment:
   Sort of answering my own question. From PostgreSQL, at least:
   
https://www.postgresql.org/docs/9.1/functions-datetime.html#FUNCTIONS-DATETIME-EXTRACT
   
   ```
   SELECT date_part('day', TIMESTAMP '2001-02-16 20:38:40');
   Result: 16
   
   SELECT date_part('hour', INTERVAL '4 hours 3 minutes');
   Result: 4
   ```
   
   It seems like the answer to the second example here should be 30?
   
   I'm getting off on a tangent, but can you specify "interval 5 minutes 90 seconds"? If so, what's the minute part: 5 or 6? If you can't specify that, can you specify "interval 90 seconds"? If not, why not?
   
   Just getting confused about the intended semantics of the date part of an interval!
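
   To make the "5 minutes 90 seconds" question concrete, here is a toy Python sketch of the extraction semantics the PostgreSQL examples above suggest. It assumes (this is an assumption, not a statement of Spark's actual implementation) that an interval is stored as a month count plus a single microseconds total, with sub-day fields normalized only at extraction time; under that model the minute part of "5 minutes 90 seconds" would come out as 6.

   ```python
   # Hypothetical model: an interval is (months, microseconds); the field names
   # and normalization rules are assumptions for illustration, not Spark's API.
   MICROS_PER_SECOND = 1_000_000
   MICROS_PER_MINUTE = 60 * MICROS_PER_SECOND
   MICROS_PER_HOUR = 60 * MICROS_PER_MINUTE
   MICROS_PER_DAY = 24 * MICROS_PER_HOUR

   def date_part(field, months, micros):
       """Extract one field from an interval stored as (months, microseconds).

       Months and microseconds are kept separate (a month has no fixed length),
       but sub-day carries (90 seconds -> 1 minute 30 seconds) fall out of the
       modular arithmetic below.
       """
       if field == "year":
           return months // 12
       if field == "month":
           return months % 12
       if field == "day":
           return micros // MICROS_PER_DAY
       if field == "hour":
           return (micros % MICROS_PER_DAY) // MICROS_PER_HOUR
       if field == "minute":
           return (micros % MICROS_PER_HOUR) // MICROS_PER_MINUTE
       if field == "second":
           return (micros % MICROS_PER_MINUTE) / MICROS_PER_SECOND
       raise ValueError(f"unknown field: {field}")

   # "interval 5 minutes 90 seconds" -> 390 seconds total
   micros = 5 * MICROS_PER_MINUTE + 90 * MICROS_PER_SECOND
   print(date_part("minute", 0, micros))  # -> 6
   print(date_part("second", 0, micros))  # -> 30.0
   ```

   Under this model "interval 90 seconds" is perfectly representable, and its minute part would be 1; whether Spark normalizes the same way is exactly the question above.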
