[
https://issues.apache.org/jira/browse/SPARK-29391?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17277990#comment-17277990
]
Wenchen Fan commented on SPARK-29391:
-------------------------------------
If this is common behavior in other databases, I think Spark should follow
it. IIRC, PostgreSQL feature parity is no longer a requirement.
> Default year-month units
> ------------------------
>
> Key: SPARK-29391
> URL: https://issues.apache.org/jira/browse/SPARK-29391
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Maxim Gekk
> Priority: Major
>
> PostgreSQL assumes year-month units by default:
> {code}
> maxim=# SELECT interval '1-2';
> interval
> ---------------
> 1 year 2 mons
> {code}
> but the same produces NULL in Spark:
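> A minimal reproduction in the spark-sql shell (prompt and output formatting approximate; the NULL result is the behavior this ticket describes):
> {code}
> spark-sql> SELECT interval '1-2';
> NULL
> {code}
> Note that Spark does accept the explicitly qualified form {{interval '1-2' year to month}}, so the ask here is only to assume year-month units when the qualifier is omitted, as PostgreSQL does.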