nchammas commented on a change in pull request #29935:
URL: https://github.com/apache/spark/pull/29935#discussion_r562180659



##########
File path: python/pyspark/sql/types.py
##########
@@ -186,6 +186,30 @@ def fromInternal(self, ts):
             return datetime.datetime.fromtimestamp(ts // 1000000).replace(microsecond=ts % 1000000)
 
 
+class CalendarIntervalType(DataType, metaclass=DataTypeSingleton):

Review comment:
       Doesn't @zero323's example from the PR description show that Spark already exposes this type?
   
   ```python
   spark.sql("SELECT current_date() - current_date()")
   ```
   
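    As a minimal sketch (assuming an active `SparkSession` bound to the name `spark`), one way to see what type Spark assigns to that expression is to print the result's schema; `printSchema()` renders the JVM-side schema string, so the interval type shows up even without a Python-side `DataType` mapping:
    
    ```python
    # Sketch only: assumes a running SparkSession named `spark`.
    df = spark.sql("SELECT current_date() - current_date() AS diff")

    # printSchema() renders the JVM-side schema tree, so it reports the
    # column's type even if PySpark has no corresponding DataType class.
    df.printSchema()
    ```
    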
    For the record, btw, Postgres supports [an `interval` type](https://www.postgresql.org/docs/current/datatype-datetime.html) and has done so since at least [version 7.1](https://www.postgresql.org/docs/7.1/datatype-datetime.html), which was released in 2001. (I mention this since Postgres often comes up as a reference for whether Spark SQL should support a feature or not.)



