itholic commented on code in PR #38644:
URL: https://github.com/apache/spark/pull/38644#discussion_r1021424319


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CastWithAnsiOnSuite.scala:
##########
@@ -242,9 +242,13 @@ class CastWithAnsiOnSuite extends CastSuiteBase with QueryErrorsBase {
   test("Fast fail for cast string type to decimal type in ansi mode") {
     checkEvaluation(cast("12345678901234567890123456789012345678", 
DecimalType(38, 0)),
       Decimal("12345678901234567890123456789012345678"))
-    checkExceptionInExpression[ArithmeticException](
-      cast("123456789012345678901234567890123456789", DecimalType(38, 0)),
-      "Out of decimal type range")
+    checkError(
+      exception = intercept[SparkArithmeticException] {
+        evaluateWithoutCodegen(cast("123456789012345678901234567890123456789", DecimalType(38, 0)))
+      },
+      errorClass = "NUMERIC_OUT_OF_SUPPORTED_RANGE",
+      parameters = Map("value" -> "123456789012345678901234567890123456789")
+    )

Review Comment:
   CI complains with `org.scalatest.exceptions.TestFailedException: Expected exception org.apache.spark.SparkArithmeticException to be thrown, but no exception was thrown`, yet this test passes in my local environment both with and without ANSI mode.

   Any suggestion for fixing this, or should we just use `checkExceptionInExpression` for now?
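
   If we go with the fallback, a minimal sketch of what I mean (the expected-message fragment below is just the offending value, on the assumption that it appears in the rendered `NUMERIC_OUT_OF_SUPPORTED_RANGE` message; it may need adjusting to the actual error text):

   ```scala
   // Fallback via the existing ExpressionEvalHelper matcher: it only asserts that an
   // ArithmeticException is thrown and that its message contains the given fragment.
   checkExceptionInExpression[ArithmeticException](
     cast("123456789012345678901234567890123456789", DecimalType(38, 0)),
     "123456789012345678901234567890123456789")
   ```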



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

