[ https://issues.apache.org/jira/browse/SPARK-4176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14562690#comment-14562690 ]
Cheng Lian commented on SPARK-4176:
-----------------------------------
No, this is about the Parquet schema conversion code path. Right now we simply
ignore decimals with precision > 18.
https://github.com/apache/spark/blob/e838a25bdb5603ef05e779225704c972ce436145/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTypes.scala#L77-L80
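For context, Parquet's DECIMAL logical type can annotate INT32 (precision <= 9), INT64 (precision <= 18), or a FIXED_LEN_BYTE_ARRAY / BINARY for anything wider, so part of the conversion work is deciding how many bytes the fixed-length array needs for a given precision. Below is a minimal, self-contained sketch of that calculation; the helper name and the standalone object are mine, not the code behind the link above.

{code:scala}
object DecimalByteWidth {
  // Minimum number of bytes a FIXED_LEN_BYTE_ARRAY needs to hold the
  // two's-complement unscaled value of a decimal with `precision` digits:
  // n bytes cover magnitudes up to 2^(8n - 1) - 1, so grow n until that
  // bound reaches 10^precision.
  def minBytesForPrecision(precision: Int): Int = {
    var numBytes = 1
    while (math.pow(2.0, 8 * numBytes - 1) < math.pow(10.0, precision)) {
      numBytes += 1
    }
    numBytes
  }

  def main(args: Array[String]): Unit = {
    // Precision 18 still fits in 8 bytes (a Long); wider precisions need more.
    Seq(9, 18, 19, 38).foreach { p =>
      println(s"precision $p -> ${minBytesForPrecision(p)} bytes")
    }
  }
}
{code}

Expected output: precision 9 -> 4 bytes, 18 -> 8, 19 -> 9, 38 -> 16, which lines up with the INT32/INT64 cutoffs above.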
I'm bumping this to 1.5.0.
> Support decimals with precision > 18 in Parquet
> -----------------------------------------------
>
> Key: SPARK-4176
> URL: https://issues.apache.org/jira/browse/SPARK-4176
> Project: Spark
> Issue Type: New Feature
> Components: SQL
> Reporter: Matei Zaharia
> Assignee: Cheng Lian
>
> After https://issues.apache.org/jira/browse/SPARK-3929, only decimals with
> precision <= 18 (i.e. those whose unscaled value fits in a Long) will be
> readable from Parquet, so more work is needed to support larger precisions.
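As a quick check of the Long cutoff mentioned in the description: Long.MaxValue is 9223372036854775807 (19 digits), so every 18-digit unscaled value fits in a signed 64-bit Long, but not every 19-digit one does. A tiny illustration (the object name is just for the example):

{code:scala}
object LongBackedDecimal {
  def main(args: Array[String]): Unit = {
    val largest18Digits = BigInt(10).pow(18) - 1  // 999999999999999999
    val largest19Digits = BigInt(10).pow(19) - 1  // 9999999999999999999
    println(largest18Digits <= BigInt(Long.MaxValue)) // true  -> fits in a Long
    println(largest19Digits <= BigInt(Long.MaxValue)) // false -> needs a wider representation
  }
}
{code}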