MaxGekk commented on issue #25981: [SPARK-28420][SQL] Support the `INTERVAL` type in `date_part()`
URL: https://github.com/apache/spark/pull/25981#issuecomment-540047745

> why in the implementation is the result of getSeconds and smaller a Decimal

PostgreSQL has the `double precision` type:
```sql
maxim=# select pg_typeof(date_part('second', interval '1 second 1 microsecond'));
    pg_typeof
------------------
 double precision
(1 row)
```
but I implemented the decimal type to avoid floating-point errors such as the loss of precision fixed in https://github.com/apache/spark/pull/25421.

> but the result of getMinutes and larger is an integer type?

How did you get the integer type? This has the `BYTE` type:
```
scala> spark.sql("select date_part('minute', interval 1 day 1 minute)").printSchema
root
 |-- date_part('minute', interval 1 days 1 minutes): byte (nullable = false)
```
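For context, a minimal sketch of the idea being discussed (all names, constants, and the decimal scale here are my assumptions for illustration, not the PR's actual code): the `SECOND` field can be derived from an interval's total microseconds as an exact decimal with scale 6, while smaller-range fields such as `MINUTE` always fall in [0, 59] and fit in a byte:

```scala
// Hypothetical sketch; object and method names are assumptions,
// not the actual implementation from this PR.
import java.math.{BigDecimal => JBigDecimal}

object IntervalFields {
  final val MICROS_PER_SECOND = 1000000L
  final val MICROS_PER_MINUTE = 60L * MICROS_PER_SECOND
  final val MICROS_PER_HOUR   = 60L * MICROS_PER_MINUTE

  // Seconds within the current minute, as an exact decimal with
  // microsecond precision: 1000001 micros => 1.000001, exactly.
  // A binary double can only approximate such values, which is the
  // kind of precision loss the decimal result type avoids.
  def getSeconds(totalMicros: Long): JBigDecimal =
    JBigDecimal.valueOf(totalMicros % MICROS_PER_MINUTE, 6)

  // Minutes within the current hour: an integral value in [0, 59]
  // (negative for negative intervals), so a Byte is wide enough.
  def getMinutes(totalMicros: Long): Byte =
    ((totalMicros % MICROS_PER_HOUR) / MICROS_PER_MINUTE).toByte
}
```

For example, `IntervalFields.getSeconds(61000001L)` (1 minute 1.000001 seconds) returns the exact decimal `1.000001`, and `IntervalFields.getMinutes(61000001L)` returns `1` as a `Byte`.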
