[ https://issues.apache.org/jira/browse/SPARK-35999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17375262#comment-17375262 ]

Apache Spark commented on SPARK-35999:
--------------------------------------

User 'sarutak' has created a pull request for this issue:
https://github.com/apache/spark/pull/33226

> Make from_csv/to_csv to handle day-time intervals properly
> ----------------------------------------------------------
>
>                 Key: SPARK-35999
>                 URL: https://issues.apache.org/jira/browse/SPARK-35999
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.2.0, 3.3.0
>            Reporter: Kousuke Saruta
>            Assignee: Kousuke Saruta
>            Priority: Major
>
> from_csv throws exception if day-time interval types are given.
> {code}
> spark-sql> select from_csv("interval '1 2:3:4' day to second", "a interval day to second");
> 21/07/03 04:39:13 ERROR SparkSQLDriver: Failed in [select from_csv("interval '1 2:3:4' day to second", "a interval day to second")]
> java.lang.Exception: Unsupported type: interval day to second
>  at org.apache.spark.sql.errors.QueryExecutionErrors$.unsupportedTypeError(QueryExecutionErrors.scala:775)
>  at org.apache.spark.sql.catalyst.csv.UnivocityParser.makeConverter(UnivocityParser.scala:224)
>  at org.apache.spark.sql.catalyst.csv.UnivocityParser.$anonfun$valueConverters$1(UnivocityParser.scala:134)
>  {code}
> Also, to_csv doesn't handle day-time interval types properly, though no exception is thrown.
> The result of to_csv for day-time interval types is not in ANSI interval form.
> {code}
> spark-sql> select to_csv(named_struct("a", interval '1 2:3:4' day to second));
> 93784000000
> {code}
> The result above should be INTERVAL '1 02:03:04' DAY TO SECOND.
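For reference, the expected ANSI form can be derived from the raw microsecond count that to_csv currently emits. A minimal Python sketch of that conversion (illustrative only, not Spark's actual formatting code; it ignores fractional seconds):

```python
# Convert a day-time interval given in microseconds (to_csv's current raw
# output) into the ANSI form INTERVAL 'D HH:MM:SS' DAY TO SECOND.
# Illustrative sketch only, not Spark's implementation; drops sub-second parts.

def ansi_day_time_interval(micros: int) -> str:
    sign = "-" if micros < 0 else ""
    seconds = abs(micros) // 1_000_000          # whole seconds
    days, rem = divmod(seconds, 86_400)         # 86400 seconds per day
    hours, rem = divmod(rem, 3_600)
    minutes, secs = divmod(rem, 60)
    return f"INTERVAL '{sign}{days} {hours:02}:{minutes:02}:{secs:02}' DAY TO SECOND"

# The value from the example above:
print(ansi_day_time_interval(93_784_000_000))
# INTERVAL '1 02:03:04' DAY TO SECOND
```

Here 93784000000 microseconds is 93784 seconds, i.e. 1 day, 2 hours, 3 minutes, and 4 seconds, matching the expected output.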



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
