maropu commented on a change in pull request #26485: [SPARK-29860][SQL] Fix dataType mismatch issue for InSubquery.
URL: https://github.com/apache/spark/pull/26485#discussion_r346066831
 
 

 ##########
 File path: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala
 ##########
 @@ -93,6 +93,14 @@ object TypeCoercion {
       Some(t2)
     case (t1: DecimalType, t2: IntegralType) if t1.isWiderThan(t2) =>
       Some(t1)
+    case (t1: DecimalType, t2: DecimalType) =>
+      // Handle two decimal types here and don't allow precision loss.
+      val widerType = DecimalPrecision.widerDecimalType(t1, t2)
+      if (widerType.isWiderThan(t1) && widerType.isWiderThan(t2)) {
+        Some(widerType)
+      } else {
+        None
+      }
 
 Review comment:
   This code looks suspicious... I personally think this issue should be fixed only in `InConversion` rather than in `findTightestCommonType`, because a change to `findTightestCommonType` can affect type coercion in other operations as well... cc: @mgaido91 @cloud-fan 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]