PHILO-HE commented on code in PR #8107:
URL: https://github.com/apache/incubator-gluten/pull/8107#discussion_r1866983736
##########
gluten-substrait/src/main/scala/org/apache/gluten/expression/UnaryExpressionTransformer.scala:
##########
@@ -43,7 +44,10 @@ case class CastTransformer(substraitExprName: String, child: ExpressionTransform
 extends UnaryExpressionTransformer {
   override def doTransform(args: java.lang.Object): ExpressionNode = {
     val typeNode = ConverterUtils.getTypeNode(dataType, original.nullable)
-    ExpressionBuilder.makeCast(typeNode, child.doTransform(args), original.ansiEnabled)
+    ExpressionBuilder.makeCast(
+      typeNode,
+      child.doTransform(args),
+      SparkShimLoader.getSparkShims.withAnsiEvalMode(original))
Review Comment:
Yes, I note that Spark also sets `ansiEnabled = true` for `EvalMode.TRY`, in order to reuse the code path of `EvalMode.ANSI`. The difference is that in TRY mode any exception is caught and NULL is returned instead.
Velox has a different implementation, which requires us to set this flag based solely on whether EvalMode is ANSI or not.
@acvictor, please leave a comment here to clarify this; it should be helpful for future code maintenance.
https://github.com/apache/spark/blob/v3.5.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala#L471
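To make the distinction concrete, here is a minimal, hypothetical sketch (the enum and helper names are illustrative stand-ins, not the actual `SparkShims` API) of the mapping described above: Spark reports `ansiEnabled = true` for both ANSI and TRY, but the flag handed to Velox must be true only for genuine ANSI mode.

```scala
// Hypothetical sketch: Spark's cast expressions carry an eval mode
// (LEGACY / ANSI / TRY). Spark's `ansiEnabled` is true for both ANSI and
// TRY (TRY reuses the ANSI code path and catches exceptions internally),
// but the flag passed to Velox must be true only for genuine ANSI mode.
object EvalMode extends Enumeration {
  val LEGACY, ANSI, TRY = Value
}

// Illustrative stand-in for SparkShimLoader.getSparkShims.withAnsiEvalMode(original).
def ansiFlagForVelox(mode: EvalMode.Value): Boolean =
  mode == EvalMode.ANSI
```

In other words, TRY must not enable the ANSI flag on the Velox side, since Velox would then throw rather than produce NULL.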
##########
gluten-ut/spark35/src/test/scala/org/apache/gluten/utils/velox/VeloxTestSettings.scala:
##########
@@ -92,6 +92,16 @@ class VeloxTestSettings extends BackendTestSettings {
     .exclude(
       "Process Infinity, -Infinity, NaN in case insensitive manner" // +inf not supported in folly.
     )
+  enableSuite[GlutenTryCastSuite]
+    .exclude(
+      "Process Infinity, -Infinity, NaN in case insensitive manner" // +inf not supported in folly.
+    )
+    .exclude("ANSI mode: Throw exception on casting out-of-range value to byte type")
Review Comment:
These tests are for ANSI ON.
@acvictor, for these tests in TryCastSuite, are exceptions not thrown, with NULL results returned instead? That should be the expected behavior for Velox cast.
I cannot figure out how these exception checks can pass in vanilla Spark, given that exceptions are handled internally in TRY mode.
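For reference, a minimal sketch of the TRY-mode semantics under discussion (the helper names are illustrative, not Spark internals): the ANSI path throws on an out-of-range value, and TRY wraps that same path to yield NULL instead of propagating the exception, which is why an exception-checking test would not be expected to pass in TRY mode.

```scala
// Illustrative ANSI-style cast: throws on overflow, like Spark's ANSI cast
// of an out-of-range value to byte.
def ansiCastToByte(v: Int): Byte =
  if (v < Byte.MinValue || v > Byte.MaxValue)
    throw new ArithmeticException(s"casting $v to byte causes overflow")
  else v.toByte

// Illustrative TRY-style cast: reuses the ANSI path but returns None
// (i.e. NULL) instead of surfacing the exception, matching the TRY-mode
// behavior described in this thread.
def tryCastToByte(v: Int): Option[Byte] =
  try Some(ansiCastToByte(v))
  catch { case _: ArithmeticException => None }
```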
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]