ulysses-you commented on PR #38760: URL: https://github.com/apache/spark/pull/38760#issuecomment-1408442961
> Do you know when we start to have this bug?

It happened in branch-3.4, after we refactored the decimal binary operator.

> And do we ever support decimal(0, 0)? like in CREATE TABLE and CAST?

It's more complex and is a long-standing issue. In short, Spark does not validate or fail when the precision is 0 in a table definition or a cast expression, but the dependencies (Hive/Parquet) do.

```sql
-- works with the in-memory catalog
create table t (c decimal(0, 0)) using parquet;

-- fails with parquet
-- java.lang.IllegalArgumentException: Invalid DECIMAL precision: 0
--   at org.apache.parquet.Preconditions.checkArgument(Preconditions.java:57)
insert into table t values(0);

-- fails with the hive catalog
-- Caused by: java.lang.IllegalArgumentException: Decimal precision out of allowed range [1,38]
--   at org.apache.hadoop.hive.serde2.typeinfo.HiveDecimalUtils.validateParameter(HiveDecimalUtils.java:44)
create table t (c decimal(0, 0)) using parquet;
```

So I think we should fail if precision is 0.
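A minimal sketch of the kind of up-front validation the comment argues Spark should perform itself, mirroring the `[1,38]` range that Hive enforces (the error message quoted above). `validate_decimal` and `MAX_PRECISION` are illustrative names for this sketch, not Spark APIs:

```python
# Hypothetical sketch, not Spark's actual implementation: reject illegal
# decimal(precision, scale) types eagerly instead of deferring the failure
# to Hive or Parquet at write time.

MAX_PRECISION = 38  # upper bound enforced by Hive's HiveDecimalUtils


def validate_decimal(precision: int, scale: int) -> None:
    """Raise ValueError unless decimal(precision, scale) is legal."""
    if not 1 <= precision <= MAX_PRECISION:
        raise ValueError(
            f"Decimal precision out of allowed range [1,{MAX_PRECISION}]: {precision}"
        )
    if not 0 <= scale <= precision:
        raise ValueError(f"Decimal scale must be in [0, {precision}]: {scale}")


validate_decimal(10, 2)  # a legal type such as decimal(10, 2) passes

try:
    validate_decimal(0, 0)  # the decimal(0, 0) case from the example above
except ValueError as e:
    print(e)
```

With a check like this in the analyzer, `create table t (c decimal(0, 0))` would fail consistently across catalogs instead of only failing later inside Parquet or Hive.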
