[
https://issues.apache.org/jira/browse/SPARK-34212?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17271071#comment-17271071
]
Dongjoon Hyun commented on SPARK-34212:
---------------------------------------
As a workaround, you may want to turn off
`spark.sql.hive.convertMetastoreParquet`.
{code:java}
spark-sql> set spark.sql.hive.convertMetastoreParquet=false;
spark.sql.hive.convertMetastoreParquet false
Time taken: 0.04 seconds, Fetched 1 row(s)
spark-sql> set spark.sql.hive.convertMetastoreParquet;
spark.sql.hive.convertMetastoreParquet false
Time taken: 0.038 seconds, Fetched 1 row(s)
spark-sql> select * from test_decimal;
21/01/24 21:20:34 WARN SessionState: METASTORE_FILTER_HOOK will be ignored,
since hive.security.authorization.manager is set to instance of
HiveAuthorizerFactory.
100.000
Time taken: 1.695 seconds, Fetched 1 row(s)
{code}
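For applications that create their own SparkSession instead of using the spark-sql shell, the same workaround can presumably be applied through the session conf. A minimal sketch, assuming an existing Hive-enabled session named `spark` and the `test_decimal` table from this ticket:
{code:scala}
// Assumption: `spark` is an existing SparkSession with Hive support enabled.
// Disable the native Parquet conversion so the Hive SerDe is used for reads.
spark.conf.set("spark.sql.hive.convertMetastoreParquet", "false")

// Table name taken from the reproduction steps quoted below.
spark.sql("SELECT * FROM test_decimal").show()
{code}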
> For a Parquet table, after changing the precision and scale of a decimal column
> in Hive, Spark reads an incorrect value
> --------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-34212
> URL: https://issues.apache.org/jira/browse/SPARK-34212
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.5, 3.1.1
> Reporter: Yahui Liu
> Priority: Major
>
> In Hive,
> {code}
> create table test_decimal(amt decimal(18,2)) stored as parquet;
> insert into test_decimal select 100;
> alter table test_decimal change amt amt decimal(19,3);
> {code}
> In Spark,
> {code}
> select * from test_decimal;
> {code}
> {code}
> +--------+
> | amt |
> +--------+
> | 10.000 |
> +--------+
> {code}
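A plausible explanation for the `10.000` above (an assumption on my part, not confirmed in this ticket): the value 100 was written under the old decimal(18,2) schema, i.e. as the unscaled integer 10000, and the native Parquet read path appears to re-attach the new scale of 3 without rescaling. A small sketch of that arithmetic:
{code:scala}
import java.math.{BigDecimal, BigInteger}

// decimal(18,2): the literal 100 is stored in Parquet as the unscaled value 10000.
val unscaled = BigInteger.valueOf(10000L)

// Re-attaching the new scale of 3 without rescaling gives 10.000,
// which matches the incorrect result reported above.
val misread = new BigDecimal(unscaled, 3)              // 10.000

// Rescaling from the original scale of 2 preserves the value, which is
// effectively what the Hive SerDe path used by the workaround does.
val correct = new BigDecimal(unscaled, 2).setScale(3)  // 100.000
{code}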