cloud-fan commented on a change in pull request #34038:
URL: https://github.com/apache/spark/pull/34038#discussion_r714504782



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/CheckAnalysis.scala
##########
@@ -401,15 +401,30 @@ trait CheckAnalysis extends PredicateHelper with LookupCatalog {
                     |the ${ordinalNumber(ti + 1)} table has ${child.output.length} columns
                   """.stripMargin.replace("\n", " ").trim())
               }
+              val isUnion = operator.isInstanceOf[Union]
+              val dataTypesAreCompatibleFn = if (isUnion) {
+                // `TypeCoercion` takes care of type coercion already. If any columns or nested
+                // columns are not compatible, we detect it here and throw analysis exception.
+                val typeChecker = (dt1: DataType, dt2: DataType) => {
+                  !TypeCoercion.findWiderTypeForTwo(dt1.asNullable, dt2.asNullable).isEmpty

Review comment:
       I see, thanks for the explanation!
   
   I think it's not good practice to rely on code logic that lives far away from the code being checked here. Is it possible to update the type coercion rule instead, so that implicit casts are added on a best-effort basis?
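   
   To make sure we're reading the diff the same way, here is a minimal, self-contained sketch (not from this PR; the helper name and the example types are mine, for illustration only) of what the per-column check amounts to, assuming `TypeCoercion.findWiderTypeForTwo` is callable from this package as it is in the diff above:
   
   ```scala
   import org.apache.spark.sql.catalyst.analysis.TypeCoercion
   import org.apache.spark.sql.types._
   
   // Hypothetical helper, not part of the PR: two Union children line up only if
   // every pair of corresponding output columns has a common wider type.
   def unionColumnsCompatible(left: Seq[DataType], right: Seq[DataType]): Boolean = {
     left.length == right.length && left.zip(right).forall { case (dt1, dt2) =>
       // Compare the nullable forms so that a nullability difference alone never
       // fails the check, mirroring the `asNullable` calls in the diff.
       TypeCoercion.findWiderTypeForTwo(dt1.asNullable, dt2.asNullable).isDefined
     }
   }
   
   // IntegerType and DoubleType widen to DoubleType, so they are compatible;
   // IntegerType and a struct have no wider common type, so they are not.
   assert(unionColumnsCompatible(Seq(IntegerType), Seq(DoubleType)))
   assert(!unionColumnsCompatible(Seq(IntegerType), Seq(new StructType().add("a", IntegerType))))
   ```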




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


