[GitHub] [spark] yaooqinn commented on pull request #27805: [SPARK-31056][SQL] Add CalendarIntervals division

2020-06-09 · GitBox
yaooqinn commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-640624624

```scala
scala> spark.udf.register("div", (x: CalendarInterval, y: CalendarInterval) => x.microseconds / y.microseconds)
res5: org.apache.spark.sql.expressions.UserDefinedFunction = ...
```
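For context, a minimal runnable sketch of that workaround, assuming a Spark 3.0 `SparkSession` named `spark` is in scope; the `div` name comes from the comment above, while the interval literals in the query and the `toDouble` cast (to avoid truncating Long division) are illustrative additions:

```scala
import org.apache.spark.unsafe.types.CalendarInterval

// Divide two intervals by taking the ratio of their microsecond parts.
// CalendarInterval also carries `months` and `days` fields, which this
// simple ratio ignores, so it is only meaningful for pure time intervals.
spark.udf.register("div", (x: CalendarInterval, y: CalendarInterval) =>
  x.microseconds.toDouble / y.microseconds)

spark.sql("SELECT div(INTERVAL 1 HOUR, INTERVAL 30 MINUTES) AS ratio").show()
// +-----+
// |ratio|
// +-----+
// |  2.0|
// +-----+
```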

[GitHub] [spark] yaooqinn commented on pull request #27805: [SPARK-31056][SQL] Add CalendarIntervals division

2020-06-02 · GitBox
yaooqinn commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-637593252

You can use both the Java API `org.apache.spark.unsafe.types.CalendarInterval.extractAsDuration().toSeconds` and SQL functions such as `date_part('hour', ...) * 3600 + date_part('minute', ...) * 60 + date_part('second', ...)`.
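A sketch of both approaches side by side, assuming Spark 3.0 (where `CalendarInterval` has a `(months, days, microseconds)` constructor) and Java 9+ (for `Duration.toSeconds`); the 90-minute sample interval is illustrative:

```scala
import org.apache.spark.unsafe.types.CalendarInterval

// Java API: extractAsDuration() returns a java.time.Duration built from
// the interval's day-time part; toSeconds gives whole seconds.
val iv = new CalendarInterval(0, 0, 90L * 60 * 1000 * 1000) // 90 min in microseconds
val secs = iv.extractAsDuration().toSeconds                 // 5400

// SQL functions: the same conversion via date_part arithmetic.
spark.sql(
  """SELECT date_part('hour',   i) * 3600
    |     + date_part('minute', i) * 60
    |     + date_part('second', i) AS secs
    |FROM VALUES (INTERVAL 90 MINUTES) AS t(i)
    |""".stripMargin).show()
// secs = 5400.000000
```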

[GitHub] [spark] yaooqinn commented on pull request #27805: [SPARK-31056][SQL] Add CalendarIntervals division

2020-05-28 · GitBox
yaooqinn commented on pull request #27805:
URL: https://github.com/apache/spark/pull/27805#issuecomment-635741463

Thanks for pinging me @maropu. Using extract/date_part functions instead may be a good choice here for end-users. Supporting `/` for intervals goes against a contract that … (a worked example of the date_part alternative follows below)
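To illustrate the suggested alternative, a hedged sketch of interval "division" done entirely with `date_part`, using made-up sample intervals: each interval is first reduced to a plain number of seconds, and the two numbers are divided, so no `/` operator on intervals themselves is needed.

```scala
// Without a `/` operator on intervals, end-users can extract comparable
// second counts from both intervals and divide those instead.
spark.sql(
  """SELECT (date_part('hour', a) * 3600 + date_part('minute', a) * 60 + date_part('second', a))
    |     / (date_part('hour', b) * 3600 + date_part('minute', b) * 60 + date_part('second', b)) AS ratio
    |FROM VALUES (INTERVAL 3 HOURS, INTERVAL 90 MINUTES) AS t(a, b)
    |""".stripMargin).show()
// ratio = 2.0
```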