kazuyukitanimura commented on code in PR #416:
URL: https://github.com/apache/datafusion-comet/pull/416#discussion_r1602009700


##########
spark/src/main/scala/org/apache/comet/serde/QueryPlanSerde.scala:
##########
@@ -617,50 +663,17 @@ object QueryPlanSerde extends Logging with ShimQueryPlanSerde with CometExprShim
           val value = cast.eval()
           exprToProtoInternal(Literal(value, dataType), inputs)
 
+        case UnaryExpression(child) if expr.prettyName == "trycast" =>

Review Comment:
   Just making sure: is it `trycast` rather than `try_cast`? Spark 3.4 appears to use `try_cast`; I haven't been able to find the Spark 3.3 source.
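
   For illustration only, a version-agnostic guard could accept both candidate spellings until the exact `prettyName` per Spark version is confirmed (the object and method names here are hypothetical, not from the PR):

   ```scala
   // Hypothetical helper: tolerate both candidate spellings of the
   // try_cast prettyName while the per-version value is unconfirmed.
   object TryCastNameSketch {
     private val candidates = Set("trycast", "try_cast")

     def isTryCast(prettyName: String): Boolean =
       candidates.contains(prettyName.toLowerCase)
   }
   ```

   The match guard would then read `case UnaryExpression(child) if TryCastNameSketch.isTryCast(expr.prettyName)`, at the cost of also matching whichever spelling the running Spark version does not use.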



##########
spark/src/main/scala/org/apache/comet/serde/QueryPlanSerde.scala:
##########
@@ -617,50 +663,17 @@ object QueryPlanSerde extends Logging with ShimQueryPlanSerde with CometExprShim
           val value = cast.eval()
           exprToProtoInternal(Literal(value, dataType), inputs)
 
+        case UnaryExpression(child) if expr.prettyName == "trycast" =>
+          val timeZoneId = SQLConf.get.sessionLocalTimeZone
+          handleCast(child, inputs, expr.dataType, Some(timeZoneId), "TRY")
+
         case Cast(child, dt, timeZoneId, evalMode) =>
-          val childExpr = exprToProtoInternal(child, inputs)
-          if (childExpr.isDefined) {
-            val evalModeStr = if (evalMode.isInstanceOf[Boolean]) {
-              // Spark 3.2 & 3.3 has ansiEnabled boolean
-              if (evalMode.asInstanceOf[Boolean]) "ANSI" else "LEGACY"
-            } else {
-              // Spark 3.4+ has EvalMode enum with values LEGACY, ANSI, and TRY
-              evalMode.toString
-            }
-            val castSupport =
-              CometCast.isSupported(child.dataType, dt, timeZoneId, evalModeStr)
-
-            def getIncompatMessage(reason: Option[String]) =
-              "Comet does not guarantee correct results for cast " +
-                s"from ${child.dataType} to $dt " +
-                s"with timezone $timeZoneId and evalMode $evalModeStr" +
-                reason.map(str => s" ($str)").getOrElse("")
-
-            castSupport match {
-              case Compatible(_) =>
-                castToProto(timeZoneId, dt, childExpr, evalModeStr)
-              case Incompatible(reason) =>
-                if (CometConf.COMET_CAST_ALLOW_INCOMPATIBLE.get()) {
-                  logWarning(getIncompatMessage(reason))
-                  castToProto(timeZoneId, dt, childExpr, evalModeStr)
-                } else {
-                  withInfo(
-                    expr,
-                    s"${getIncompatMessage(reason)}. To enable all incompatible casts, set " +
-                      s"${CometConf.COMET_CAST_ALLOW_INCOMPATIBLE.key}=true")
-                  None
-                }
-              case Unsupported =>
-                withInfo(
-                  expr,
-                  s"Unsupported cast from ${child.dataType} to $dt " +
-                    s"with timezone $timeZoneId and evalMode $evalModeStr")
-                None
-            }
+          val evalModeStr = if (evalMode.isInstanceOf[Boolean]) {
+            if (evalMode.asInstanceOf[Boolean]) "ANSI" else "LEGACY"

Review Comment:
   I would say let's keep the comments `// Spark 3.2 & 3.3 has ansiEnabled boolean` and `// Spark 3.4+ has EvalMode enum with values LEGACY, ANSI, and TRY`, so that when we drop the older Spark versions we can clean this up easily.
   
   Another approach would be to move this to the shims.
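
   As a rough sketch of that shim idea (names hypothetical, not from the Comet codebase), the version-specific `evalMode` handling could sit behind a single helper, so dropping Spark 3.2/3.3 support later only touches the shim:

   ```scala
   // Hypothetical shim helper: normalize the version-specific evalMode
   // representation into the string form used by the cast serde.
   object CastEvalModeShimSketch {
     def evalModeToString(evalMode: Any): String = evalMode match {
       // Spark 3.2 & 3.3 has ansiEnabled boolean
       case ansiEnabled: Boolean => if (ansiEnabled) "ANSI" else "LEGACY"
       // Spark 3.4+ has EvalMode enum with values LEGACY, ANSI, and TRY
       case evalModeEnum => evalModeEnum.toString
     }
   }
   ```

   The `Cast` branch would then call `CastEvalModeShimSketch.evalModeToString(evalMode)` instead of testing `isInstanceOf[Boolean]` inline.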



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

