coderfender commented on code in PR #2472:
URL: https://github.com/apache/datafusion-comet/pull/2472#discussion_r2389664692


##########
spark/src/test/scala/org/apache/comet/CometCastSuite.scala:
##########
@@ -322,8 +322,7 @@ class CometCastSuite extends CometTestBase with AdaptiveSparkPlanHelper {
     castTest(generateInts(), DataTypes.DoubleType)
   }
 
-  ignore("cast IntegerType to DecimalType(10,2)") {
-    // Comet should have failed with [NUMERIC_VALUE_OUT_OF_RANGE] -1117686336 cannot be represented as Decimal(10, 2)
+  test("cast IntegerType to DecimalType(10,2)") {

Review Comment:
   @parthchandra, I looked into [DataGenerator.scala](https://github.com/apache/datafusion-comet/blob/eea40ca1526fcbf7c6b811bba8d28289b00be09f/spark/src/test/scala/org/apache/comet/DataGenerator.scala#L99) and it seems we already test random int values along with `Int.MinValue` and `Int.MaxValue`, both of which are well beyond the precision of `Decimal(10,2)`. I also wrote a dedicated test case (not committed to the branch yet), but it feels redundant at the moment. Please advise whether you think it is worth adding the test below to explicitly verify overflow values:
   
   ```
     test("cast IntegerType to DecimalType(10,2) overflowable check") {
       // Decimal(10,2) allows at most 8 integer digits, so every value here
       // (magnitude >= 100000000) is expected to overflow.
       val intToDecimal10OverflowValues =
         withNulls(Seq(Int.MinValue, -100000000, -100000001, 100000000, 100000001, Int.MaxValue)).toDF("a")
       castTest(intToDecimal10OverflowValues, DataTypes.createDecimalType(10, 2))
     }
   ```
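   For extra context on why every value in that `Seq` must overflow: `Decimal(10,2)` holds 10 total digits with 2 after the decimal point, so the integer part is capped at 8 digits and the representable range is `[-99999999.99, 99999999.99]`. Below is a minimal standalone sketch (the object name is just illustrative) using Spark's `Decimal.changePrecision`, which returns `false` when a value does not fit the target precision and scale:
   
   ```
   import org.apache.spark.sql.types.Decimal
   
   object Decimal10x2OverflowSketch extends App {
     // changePrecision(precision, scale) returns false when the value cannot be
     // represented with the requested precision/scale, i.e. on overflow.
     println(Decimal(99999999).changePrecision(10, 2))     // true:  fits as 99999999.00
     println(Decimal(100000000).changePrecision(10, 2))    // false: 9 integer digits, only 8 allowed
     println(Decimal(Int.MinValue).changePrecision(10, 2)) // false: -2147483648 has 10 digits
     println(Decimal(Int.MaxValue).changePrecision(10, 2)) // false: 2147483647 has 10 digits
   }
   ```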


