zero323 edited a comment on pull request #29935:
URL: https://github.com/apache/spark/pull/29935#issuecomment-798601773


   > Yeah, it would be nice if we map it to timedelta.
   
   My biggest concern is that the Spark implementation doesn't really map to
`timedelta`. Take `INTERVAL 2 years` or `INTERVAL 1 month`: we could derive
some arbitrary rules for handling intervals expressed in units larger than
weeks, but I am not sure how useful those would be in practice.
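   
   To make the mismatch concrete, here is a minimal sketch in plain Python
(`dateutil.relativedelta` is used only as a calendar-aware point of comparison,
not as a proposed mapping). `timedelta` has no month or year fields, and no
fixed number of days stands in for "1 month" consistently:
   
   ```python
   from datetime import date, timedelta
   from dateutil.relativedelta import relativedelta
   
   # A fixed-duration stand-in for "1 month" cannot be right for every start date:
   print(date(2021, 1, 31) + timedelta(days=28))  # 2021-02-28, happens to match
   print(date(2021, 3, 31) + timedelta(days=28))  # 2021-04-28, but April has 30 days
   
   # Calendar-aware arithmetic, closer to SQL INTERVAL semantics:
   print(date(2021, 1, 31) + relativedelta(months=1))  # 2021-02-28
   print(date(2021, 3, 31) + relativedelta(months=1))  # 2021-04-30
   ```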
   
   Ultimately, I'd like the following property to hold
   
   ```python
   cd, i, yfn = spark.sql("""
       SELECT *, cd + i AS yfn
       FROM (SELECT current_date() AS cd, INTERVAL 1 year AS i) t
   """).first()
   
   assert cd + i == yfn
   ```
   
   for an arbitrary `Interval` `i`, which, unless I am missing something here,
won't be possible with `timedelta`.
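   
   To illustrate with concrete (hypothetical) values: pick `cd` so that the
one-year span crosses a leap day, and even the closest fixed-duration stand-in
for `INTERVAL 1 year` already violates the property:
   
   ```python
   from datetime import date, timedelta
   
   cd = date(2023, 7, 15)   # hypothetical current_date()
   yfn = date(2024, 7, 15)  # what calendar-based "+ 1 year" yields in SQL
   i = timedelta(days=365)  # best fixed-duration approximation of the interval
   
   print(cd + i)            # 2024-07-14: the span crosses 2024-02-29
   print(cd + i == yfn)     # False, the round trip breaks
   ```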

