yaooqinn commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-640624624
```scala
scala> spark.udf.register("div", (x: CalendarInterval, y: CalendarInterval) => x.microseconds / y.microseconds)
res5: org.apache.spark.sql.expressions.UserDefin
```
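A minimal, Spark-free sketch of what a `div` UDF like the one above computes: dividing two intervals by comparing their total microsecond lengths. Plain `Long`s stand in for `CalendarInterval.microseconds` here purely for illustration; the real field lives on `org.apache.spark.unsafe.types.CalendarInterval`.

```scala
object IntervalDivSketch {
  // Dividing one interval by another reduces to dividing their
  // microsecond lengths (hypothetical stand-in for the UDF body).
  def div(xMicros: Long, yMicros: Long): Long = xMicros / yMicros

  def main(args: Array[String]): Unit = {
    val hour   = 3600L * 1000000L // 1 hour in microseconds
    val minute = 60L * 1000000L   // 1 minute in microseconds
    println(div(hour, minute))    // how many minutes fit in an hour
  }
}
```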
yaooqinn commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-637593252
You can use both the Java API
`org.apache.spark.unsafe.types.CalendarInterval.extractAsDuration().toSeconds`
and the SQL functions
`date_part('hour', ...) * 3600 + date_pa
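A sketch of the two routes to an interval's length in seconds, using `java.time.Duration` in place of `CalendarInterval` (an assumption for illustration): the first collapses the time portion at once, as `extractAsDuration().toSeconds` would, while the second sums the fields manually, as the `date_part` arithmetic does.

```scala
import java.time.Duration

object IntervalSecondsSketch {
  // Mirrors extractAsDuration().toSeconds: one call on the extracted Duration.
  def viaDuration(d: Duration): Long = d.toSeconds

  // Mirrors date_part('hour', i) * 3600 + date_part('minute', i) * 60 + ...
  def viaParts(hours: Long, minutes: Long, seconds: Long): Long =
    hours * 3600 + minutes * 60 + seconds

  def main(args: Array[String]): Unit = {
    val d = Duration.ofHours(1).plusMinutes(30).plusSeconds(15)
    println(viaDuration(d))      // 5415
    println(viaParts(1, 30, 15)) // 5415 — both routes agree
  }
}
```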
yaooqinn commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-635741463
thanks for pinging me @maropu
Using the extract/date_part functions instead may be a good choice here for
end-users. Supporting `/` for intervals goes against a contract tha