srowen commented on issue #25981: [SPARK-28420][SQL] Support the `INTERVAL` type in `date_part()` URL: https://github.com/apache/spark/pull/25981#issuecomment-540015202 OK, here's another way of asking: why, in the implementation, is the result of `getSeconds` and smaller a Decimal, while the result of `getMinutes` and larger is an integer type? If this is simply "because that's how Spark already handles date parts of timestamps" or "that's what PostgreSQL does", then at least there is some consistency there, and I think this could be OK. I can't find any good reference for this, other than PostgreSQL examples. Is it specific to that DB? I see other DBs support `EXTRACT` for this.
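The asymmetry being asked about can be illustrated with a minimal, self-contained sketch (plain Python, not the actual Spark implementation; the field names and helper here are hypothetical). The usual rationale is that seconds are the unit that carries the interval's fractional (microsecond) part, so `SECOND` is returned as an exact Decimal, while `MINUTE` and larger are whole counts and fit an integer type:

```python
from datetime import timedelta
from decimal import Decimal

def date_part(field, interval):
    """Hypothetical sketch of extracting fields from an interval.

    SECOND keeps the fractional microseconds, hence Decimal;
    MINUTE and larger are whole units, hence plain ints.
    """
    total_us = interval // timedelta(microseconds=1)  # interval as microseconds
    if field == "SECOND":
        # exact fraction: e.g. 1.5 for 1 second 500000 microseconds
        return Decimal(total_us % 60_000_000) / Decimal(1_000_000)
    if field == "MINUTE":
        return (total_us // 60_000_000) % 60
    if field == "HOUR":
        return total_us // 3_600_000_000  # days/months not modeled in this sketch
    raise ValueError(f"unsupported field: {field}")
```

For example, `date_part("SECOND", timedelta(seconds=1, microseconds=500000))` yields `Decimal('1.5')`, whereas `date_part("MINUTE", ...)` always yields an int, which mirrors the Decimal-vs-integer split questioned above.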
