[ https://issues.apache.org/jira/browse/SPARK-9119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Davies Liu resolved SPARK-9119.
-------------------------------
Resolution: Fixed
Fix Version/s: 1.5.0
Issue resolved by pull request 7925
[https://github.com/apache/spark/pull/7925]
> In some cases, we may save wrong decimal values to parquet
> ----------------------------------------------------------
>
> Key: SPARK-9119
> URL: https://issues.apache.org/jira/browse/SPARK-9119
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Reporter: Yin Huai
> Assignee: Davies Liu
> Priority: Blocker
> Fix For: 1.5.0
>
>
> {code}
> import org.apache.spark.sql.Row
> import org.apache.spark.sql.types.{StructType, StructField, StringType, DecimalType}
> import org.apache.spark.sql.types.Decimal
>
> val schema = StructType(Array(StructField("name", DecimalType(10, 5), false)))
> val rowRDD = sc.parallelize(Array(Row(Decimal("67123.45"))))
> val df = sqlContext.createDataFrame(rowRDD, schema)
> df.registerTempTable("test")
> df.show()
>
> // +--------+
> // | name|
> // +--------+
> // |67123.45|
> // +--------+
> sqlContext.sql("create table testDecimal as select * from test")
> sqlContext.table("testDecimal").show()
> // +--------+
> // | name|
> // +--------+
> // |67.12345|
> // +--------+
> {code}
> The problem is that when we do the conversion, we do not use the precision/scale
> info in the schema: the input Decimal("67123.45") carries scale 2, while the schema
> declares scale 5, so the stored value is read back with the wrong scale.
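> The mismatch can be reproduced with plain java.math.BigDecimal. The following is a
> minimal sketch, not Spark's actual writer code, assuming decimals are persisted as
> unscaled values and reconstructed on read using the schema's scale (schemaScale
> here is a stand-in for the scale carried by the Catalyst DecimalType):
> {code}
> import java.math.BigDecimal
>
> // The input value carries scale 2, but the schema declares DecimalType(10, 5).
> val value = new BigDecimal("67123.45")               // unscaled = 6712345, scale = 2
>
> // Buggy write path: store the unscaled value without consulting the schema.
> val storedUnscaled = value.unscaledValue.longValue   // 6712345
>
> // Read path: reconstruct using the schema's scale (5), not the value's (2).
> val schemaScale = 5
> println(BigDecimal.valueOf(storedUnscaled, schemaScale))  // 67.12345 -- wrong
>
> // Fixed write path: rescale to the schema's scale before storing.
> val rescaled = value.setScale(schemaScale)           // 67123.45000, scale = 5
> val fixedUnscaled = rescaled.unscaledValue.longValue // 6712345000
> println(BigDecimal.valueOf(fixedUnscaled, schemaScale))   // 67123.45000 -- correct
> {code}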