EnricoMi commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-637441131


   The division does not go against the contract: it is only defined when both intervals have the same single resolution (months, days, or micros), and it returns null on mixtures, see 
https://github.com/apache/spark/pull/27805/files#diff-eba257f41b49f470321579875f054f00R540.
   
   The `extract` and `date_part` functions cannot be used to evaluate the 
magnitude of an interval:
   
   ```
   val data = Seq((Timestamp.valueOf("2020-02-01 12:00:00"), Timestamp.valueOf("2020-02-01 13:30:25"))).toDF("start", "end")
   
   data.show
   +-------------------+-------------------+
   |              start|                end|
   +-------------------+-------------------+
   |2020-02-01 12:00:00|2020-02-01 13:30:25|
   +-------------------+-------------------+
   
   data.select(expr("date_part('minute', end - start)"), expr("date_part('hour', end - start)")).show
   +-------------------------------------------------------+-----------------------------------------------------+
   |date_part('minute', subtracttimestamps(`end`, `start`))|date_part('hour', subtracttimestamps(`end`, `start`))|
   +-------------------------------------------------------+-----------------------------------------------------+
   |                                                     30|                                                    1|
   +-------------------------------------------------------+-----------------------------------------------------+
   
   ```
   
   Neither `date_part('minute', ...)` nor `date_part('hour', ...)` provides me with an accurate magnitude of the interval (roughly 90 minutes, or 1.5 hours): each extracts only its own field of the interval, not the total.
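   For comparison, a workaround sketch (not part of this PR): casting both timestamps to `long` epoch seconds and dividing yields the full magnitude in whichever unit is needed. This assumes the `data` DataFrame from above and `org.apache.spark.sql.functions._` in scope:
   
   ```
   // Hedged sketch, not from the PR: derive the interval's total magnitude
   // by casting the timestamps to epoch seconds (assumes the `data`
   // DataFrame defined above and org.apache.spark.sql.functions._ imported).
   val seconds = col("end").cast("long") - col("start").cast("long")
   data.select(
     (seconds / 60).as("minutes"),   // total minutes, fractional
     (seconds / 3600).as("hours")    // total hours, fractional
   ).show
   ```
   
   This works for timestamp differences, but of course it sidesteps the interval type entirely, which is what the proposed division operator addresses.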


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
