Zhen Wang created SPARK-44902:
---------------------------------
Summary: The precision of LongDecimal is inconsistent with Hive.
Key: SPARK-44902
URL: https://issues.apache.org/jira/browse/SPARK-44902
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.4.0
Reporter: Zhen Wang
The precision Hive derives for a bigint is 19, but Spark's LongDecimal uses
precision 20. This leads to type conversion errors in some cases, for example
when Spark reads a view whose schema was derived by Hive.
Relevant code:
[https://github.com/apache/spark/blob/4646991abd7f4a47a1b8712e2017a2fae98f7c5a/sql/api/src/main/scala/org/apache/spark/sql/types/DecimalType.scala#L129]
[https://github.com/apache/hive/blob/3d3acc7a19399d749a39818573a76a0dbbaf2598/serde/src/java/org/apache/hadoop/hive/serde2/typeinfo/HiveDecimalUtils.java#L76]
Reproduce:
Create a table and a view in Hive:
{code:java}
create table t (value bigint);
create view v as select value * 0.1 from t; {code}
Read the view in Spark:
{code:java}
select * from v; {code}
The following error occurs:
{code:java}
org.apache.spark.sql.AnalysisException: [CANNOT_UP_CAST_DATATYPE] Cannot up cast `(value * 0.1)` from "DECIMAL(22,1)" to "DECIMAL(21,1)".
The type path of the target object is:
You can either add an explicit cast to the input data or choose a higher precision type of the field in the target object
    at org.apache.spark.sql.errors.QueryCompilationErrors$.upCastFailureError(QueryCompilationErrors.scala:285)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveUpCast$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveUpCast$$fail(Analyzer.scala:3627)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveUpCast$$anonfun$apply$57$$anonfun$applyOrElse$235.applyOrElse(Analyzer.scala:3658)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveUpCast$$anonfun$apply$57$$anonfun$applyOrElse$235.applyOrElse(Analyzer.scala:3635)
{code}
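The mismatch in the error message can be reproduced with a minimal sketch of
the standard decimal multiplication rule (result precision = p1 + p2 + 1,
result scale = s1 + s2), assuming bigint maps to DECIMAL(20,0) in Spark but
DECIMAL(19,0) in Hive, and the literal 0.1 is DECIMAL(1,1) in both. The class
and method names below are illustrative, not Spark or Hive APIs:

```java
// Illustrative sketch: derive the result type of a decimal multiplication
// using the rule precision = p1 + p2 + 1, scale = s1 + s2.
public class DecimalPrecision {
    // Returns {precision, scale} of the product of DECIMAL(p1,s1) * DECIMAL(p2,s2).
    static int[] multiply(int p1, int s1, int p2, int s2) {
        return new int[] { p1 + p2 + 1, s1 + s2 };
    }

    public static void main(String[] args) {
        // bigint: DECIMAL(20,0) in Spark vs DECIMAL(19,0) in Hive;
        // the literal 0.1 is DECIMAL(1,1) in both systems.
        int[] spark = multiply(20, 0, 1, 1);
        int[] hive  = multiply(19, 0, 1, 1);
        // Matches the error: Spark expects DECIMAL(22,1), Hive wrote DECIMAL(21,1).
        System.out.println("Spark: DECIMAL(" + spark[0] + "," + spark[1] + ")");
        System.out.println("Hive:  DECIMAL(" + hive[0] + "," + hive[1] + ")");
    }
}
```

With these inputs the sketch yields DECIMAL(22,1) for Spark and DECIMAL(21,1)
for Hive, which is exactly the pair reported by CANNOT_UP_CAST_DATATYPE above.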
--
This message was sent by Atlassian Jira
(v8.20.10#820010)