Nitin003 opened a new issue, #10085:
URL: https://github.com/apache/incubator-gluten/issues/10085
### Backend
CH (ClickHouse)
### Bug description
Gluten throws a RuntimeException when testing a decimal overflow scenario.
I am comparing the behavior of vanilla Spark and Spark with
Gluten (ClickHouse as the native engine).
Steps to reproduce:
1. Create a table:
create table tbl_decimal_over(id decimal(38,38)) using parquet;
2. Insert sample test data into the table:
insert into tbl_decimal_over values
(0.12345678900987654321123456789009876543),
(0.95400000000000000000000000000000000000);
3. Run an aggregate over the column:
select sum(id) from tbl_decimal_over;
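For reference, the overflow in step 3 can be checked outside Spark. A minimal sketch with Python's `decimal` module (illustration only, not part of the reproduction): the exact sum of the two inserted values needs 39 significant digits (1 integer digit plus the 38 fractional digits of `decimal(38,38)`), which exceeds Spark's maximum decimal precision of 38.

```python
from decimal import Decimal, getcontext

# Give the context enough headroom so the sum is computed exactly
# (the default precision of 28 would silently round it).
getcontext().prec = 50

a = Decimal("0.12345678900987654321123456789009876543")
b = Decimal("0.95400000000000000000000000000000000000")
total = a + b

# Count the significant digits of the exact result.
digits = len(total.as_tuple().digits)
print(total)   # 1.07745678900987654321123456789009876543
print(digits)  # 39 -> does not fit in decimal(38,38), max precision 38
```

With non-ANSI semantics, vanilla Spark maps this unrepresentable result to null, which is why null is the expected output above.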
**[Expected behavior]**
**vanilla Spark -** should return null in the above case (non-ANSI decimal overflow semantics).
**Gluten with CH as native -** should also return null.
**[Actual behavior]**
**vanilla Spark -** null (as expected).
**Gluten with CH as native -** fails with a RuntimeException:
java.lang.RuntimeException: Error while decoding: java.lang.ArithmeticException:
Decimal precision 39 exceeds max precision 38.
### Gluten version
main branch
### Spark version
Spark-3.3.x
### Spark configurations
_No response_
### System information
_No response_
### Relevant logs
```bash
java.lang.RuntimeException: Error while decoding: java.lang.ArithmeticException:
Decimal precision 39 exceeds max precision 38.
createexternalrow(input[0, decimal(38,38), true].toJavaBigDecimal,
StructField(sum(id), DecimalType(38,38), true))
at org.apache.spark.sql.errors.QueryExecutionErrors$.expressionDecodingError
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]