cloud-fan commented on code in PR #45383:
URL: https://github.com/apache/spark/pull/45383#discussion_r1547288333


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##########
@@ -702,26 +707,39 @@ abstract class TypeCoercionBase {
         }.getOrElse(b)  // If there is no applicable conversion, leave expression unchanged.
 
       case e: ImplicitCastInputTypes if e.inputTypes.nonEmpty =>
-        val children: Seq[Expression] = e.children.zip(e.inputTypes).map { case (in, expected) =>
+        val childrenBeforeCollations: Seq[Expression] = e.children.zip(e.inputTypes).map {
           // If we cannot do the implicit cast, just use the original input.
-          implicitCast(in, expected).getOrElse(in)
+          case (in, expected) => implicitCast(in, expected).getOrElse(in)
+        }
+        val st = getOutputCollation(e.children)
+        val children: Seq[Expression] = childrenBeforeCollations.map {

Review Comment:
   Note: if we want function inputs to be correlated, please match the expression explicitly to handle that case. `ImplicitCastInputTypes` does not indicate input correlation, so please DO NOT make that assumption here.
   
   Looking at `ConcatWs`, it does need its inputs to use the same string collation. Let's match it explicitly in the new rule and handle it correctly.
   
   In principle, let's not generalize things without seeing the full picture. What we should do for `ConcatWs` does not necessarily apply to all `ImplicitCastInputTypes` implementations.
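   For concreteness, matching `ConcatWs` explicitly in the new rule could look roughly like the sketch below. This is only an illustration, not the final implementation: `getOutputCollation` is the helper from the diff above, and the `StringType` check and `Cast` placement are assumptions.
   
   ```scala
   // Hypothetical sketch: handle ConcatWs explicitly in the collation rule,
   // instead of assuming input correlation for every ImplicitCastInputTypes
   // expression. getOutputCollation is taken from the diff above; the
   // StringType/Cast handling here is an assumption.
   case c: ConcatWs =>
     val st = getOutputCollation(c.children)
     c.withNewChildren(c.children.map {
       // Only string-typed children need to be cast to the common collation.
       case child if child.dataType.isInstanceOf[StringType] && child.dataType != st =>
         Cast(child, st)
       case other => other
     })
   ```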



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

