andygrove opened a new issue, #375: URL: https://github.com/apache/datafusion-comet/issues/375
### What is the problem the feature request solves?

Comet is not consistent with Spark when casting between decimals. Here is a test to demonstrate this.

```scala
test("cast between decimals with different precision and scale") {
  val rowData = Seq(
    Row(BigDecimal("12345.6789")),
    Row(BigDecimal("9876.5432")),
    Row(BigDecimal("123.4567"))
  )
  val df = spark.createDataFrame(
    spark.sparkContext.parallelize(rowData),
    StructType(Seq(StructField("a", DataTypes.createDecimalType(10, 4))))
  )
  castTest(df, DataTypes.createDecimalType(6, 2))
}
```

## Spark Result

```
+----------+-------+
|         a|      a|
+----------+-------+
|  123.4567| 123.46|
| 9876.5432|9876.54|
|12345.6789|   null|
+----------+-------+
```

## Comet Result

```
java.lang.ArithmeticException: Cannot convert 12345.68 (bytes: [B@4f834a43, integer: 1234568) to decimal with precision: 6 and scale: 2
	at org.apache.comet.vector.CometVector.getDecimal(CometVector.java:86)
```

### Describe the potential solution

_No response_

### Additional context

_No response_

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
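For context, the Spark result above comes from Spark's non-ANSI cast behavior: the value is first rounded to the target scale, and if the rounded value then exceeds the target precision, the cast produces `null` rather than throwing. A minimal sketch of that semantics using `java.math.BigDecimal` (the `castDecimal` helper is hypothetical, not Spark or Comet API; half-up rounding is assumed):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalCastSketch {
    // Hypothetical helper sketching Spark's non-ANSI decimal cast:
    // round to the target scale, then return null if the rounded
    // value no longer fits the target precision.
    static BigDecimal castDecimal(BigDecimal value, int precision, int scale) {
        BigDecimal rounded = value.setScale(scale, RoundingMode.HALF_UP);
        return rounded.precision() > precision ? null : rounded;
    }

    public static void main(String[] args) {
        // Same rows as the test above, cast to DECIMAL(6, 2):
        System.out.println(castDecimal(new BigDecimal("123.4567"), 6, 2));   // 123.46
        System.out.println(castDecimal(new BigDecimal("9876.5432"), 6, 2));  // 9876.54
        // 12345.6789 rounds to 12345.68, which has 7 digits > precision 6,
        // so Spark yields null while Comet throws ArithmeticException here.
        System.out.println(castDecimal(new BigDecimal("12345.6789"), 6, 2)); // null
    }
}
```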