zero323 commented on pull request #29935:
URL: https://github.com/apache/spark/pull/29935#issuecomment-803946095


   Thank you for your input @MaxGekk!
   
   > I would like to propose to not expose `CalendarIntervalType` (consider it 
as a legacy one), and focus on new types `YearMonthIntervalType` and 
`DayTimeIntervalType` (see 
[SPARK-27790](https://issues.apache.org/jira/browse/SPARK-27790)).
   
   In my opinion SPARK-27790 would definitely make SPARK-33056 obsolete ‒ not that we had very good proposals for a useful implementation anyway. However, I am not sure this really affects SPARK-33055. I'd argue it is not so much a new feature (something new is exported) as a bugfix (a component that was accidentally omitted is now included) ‒ I can run the queries included in the JIRA in Scala, Java, SparkR and even some 3rd party bindings ‒ even if the type is legacy, it is supported there.
   
   You can even add a cast:
   
   ```python
   # date subtraction yields a CalendarIntervalType value; the cast turns it into a plain string column
   spark.sql("SELECT CAST(current_date() - current_date() AS string)")
   ```
   
   and PySpark won't see a problem.
   
   An unhandled exception in such a case is just not good.
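
   To illustrate (a rough sketch ‒ the exact error depends on the version, so treat the comments as illustrative rather than a transcript):

   ```python
   # Without the cast, the result column is of CalendarIntervalType
   df = spark.sql("SELECT current_date() - current_date() AS diff")

   # The JVM side handles it fine, but bringing the values over to Python,
   # e.g. with collect(), is where the unhandled exception shows up on the
   # affected versions
   df.collect()
   ```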
   
   Even if `CalendarIntervalType` is considered legacy, I'd still consider starting a discussion about backporting this minimal fix to 3.0 and 3.1.
   
   
   > If you have a proposal of mapping 
`YearMonthIntervalType`/`DayTimeIntervalType` to python types (from standard 
lib) like we did for Java/Scala already:
   
   I'll take a look when I have a chance, but if I am not mistaken, equivalents of these are already supported in Arrow, so that's probably where we should start looking.
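
   Just to sketch the direction I have in mind (illustrative only ‒ these names are not a concrete proposal, and the exact pyarrow constructors vary between versions):

   ```python
   import datetime

   import pyarrow as pa

   # DayTimeIntervalType looks like a natural fit for datetime.timedelta on
   # the Python side and for Arrow's duration type on the Arrow side
   day_time_python = datetime.timedelta
   day_time_arrow = pa.duration("us")

   # YearMonthIntervalType is less obvious ‒ the standard library has no
   # year-month duration, although the Arrow format itself defines a
   # YEAR_MONTH interval unit
   year_month_python = None  # open question
   ```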
   
   If you don't mind a quick question about the future of `CalendarIntervalType` ‒ can it be decomposed into `YearMonthIntervalType` + `DayTimeIntervalType`?
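
   Asking because, if I am not mistaken, `CalendarInterval` carries months, days and microseconds, so naively a split along these lines (purely illustrative, not an existing API) seems conceivable:

   ```python
   from collections import namedtuple

   # Rough stand-in for org.apache.spark.unsafe.types.CalendarInterval
   CalendarInterval = namedtuple("CalendarInterval", ["months", "days", "microseconds"])

   def decompose(interval):
       # months -> YearMonthIntervalType, (days, microseconds) -> DayTimeIntervalType
       return interval.months, (interval.days, interval.microseconds)
   ```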
   

