Github user dmateusp commented on the issue:
https://github.com/apache/spark/pull/21706
In the current Spark version, I can run:
```scala
scala> spark.sql("SELECT 'interval 1 hour' as a").select(col("a").cast("calendarinterval")).show()
+----------------+
|               a|
+----------------+
|interval 1 hours|
+----------------+
```
while `spark.sql("SELECT CALENDARINTERVAL('interval 1 hour') as a").show()` throws an exception.
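For reference, here is a minimal sketch of the raw-SQL cast spelling that would mirror the DataFrame call above (the `CAST ... AS calendarinterval` form is my assumption of the intended SQL equivalent; I have not checked whether the parser accepts it):
```scala
// Hypothetical raw-SQL counterpart of col("a").cast("calendarinterval");
// whether this parses is exactly the SQL-vs-DataFrame consistency
// question raised here.
spark.sql("SELECT CAST('interval 1 hour' AS calendarinterval) AS a").show()
```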
I am not sure what this PR changes with respect to exposing it as an external data type; it just makes the behavior consistent between raw SQL and the DataFrame API.
Is the plan to remove `CalendarIntervalType` completely?