GitHub user dmateusp opened a pull request:
https://github.com/apache/spark/pull/21706
[SPARK-24702] Fix "Unable to cast to calendar interval" in Spark SQL
## What changes were proposed in this pull request?
Make `calendarinterval` a parseable DataType keyword so it can be used as a
cast target in SQL.
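For illustration, a minimal sketch of the idea, mapping the keyword onto
Catalyst's `CalendarIntervalType`; the function name and structure below are
assumptions and do not reproduce the actual parser code in `sql/catalyst`:
```scala
import org.apache.spark.sql.types._

// Illustrative only: models how a primitive type keyword could resolve to a
// Catalyst DataType; the real change lives in Spark's SQL grammar/parser.
def parsePrimitiveType(name: String): DataType = name.toLowerCase match {
  case "int" | "integer"  => IntegerType
  case "string"           => StringType
  case "calendarinterval" => CalendarIntervalType // keyword this PR makes parseable
  case other =>
    throw new IllegalArgumentException(s"DataType $other is not supported.")
}
```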
## How was this patch tested?
Added a parser test in `sql.catalyst.parser.DataTypeParserSuite`.
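Roughly, the added check could look like the following; the exact helper and
test name used in `DataTypeParserSuite` are assumptions:
```scala
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
import org.apache.spark.sql.types.CalendarIntervalType

// Assumed shape of the new assertion: the keyword parses to CalendarIntervalType.
assert(CatalystSqlParser.parseDataType("calendarinterval") == CalendarIntervalType)
```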
Before:
```
DataType calendarinterval is not supported.(line 1, pos 48)

== SQL ==
select cast(cast(interval '1' day as string) as calendarinterval)
```
After:
```scala
scala> spark.sql("select cast(cast(interval '1' day as string) as calendarinterval)")
res0: org.apache.spark.sql.DataFrame = [CAST(CAST(interval 1 days AS STRING) AS CALENDARINTERVAL): calendarinterval]
```
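As a further hypothetical usage example, assuming the patch is applied, the
cast could feed interval arithmetic:
```scala
// Assumes this patch is applied; illustrative only.
spark.sql(
  "select current_timestamp() + cast('interval 2 days' as calendarinterval) as two_days_later"
).show(truncate = false)
```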
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/dmateusp/spark SPARK-24702_calendar_interval
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/21706.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #21706
----
commit 26c7dd183b41c0fd7da44433afa1be3b68ac5ab1
Author: Daniel Pires <dmateusp@...>
Date: 2018-07-03T14:25:03Z
Adding CalendarInterval in sql.catalyst.parser; it is a supported data
type but wasn't parseable, so casting to an interval was throwing a
ParseException
----