ulysses-you commented on code in PR #37832:
URL: https://github.com/apache/spark/pull/37832#discussion_r965636869


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/types/DecimalType.scala:
##########
@@ -94,12 +94,15 @@ case class DecimalType(precision: Int, scale: Int) extends FractionalType {
    */
   private[sql] def isTighterThan(other: DataType): Boolean = isTighterThanInternal(other)
 
-  @tailrec
   private def isTighterThanInternal(other: DataType): Boolean = other match {
     case dt: DecimalType =>
       (precision - scale) <= (dt.precision - dt.scale) && scale <= dt.scale
     case dt: IntegralType =>
-      isTighterThanInternal(DecimalType.forType(dt))
+      val integerAsDecimal = DecimalType.forType(dt)
+      assert(integerAsDecimal.scale == 0)
+      // If the precision equals `integerAsDecimal.precision`, there can be integer overflow
+      // during casting.
+      precision < integerAsDecimal.precision && scale == 0

Review Comment:
   Good catch!
   
   It seems we can combine the methods `isTighterThan` and `isTighterThanInternal`, since there is no recursion anymore.
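   The overflow concern the diff addresses can be sketched outside of Spark. The stand-in types below are hypothetical (not the real Catalyst classes); `decimalPrecision` mirrors what `DecimalType.forType` returns for each integral type (e.g. `DecimalType(10, 0)` for `IntegerType`). The key point: a `decimal(10, 0)` can hold values such as 9999999999, which exceed `Int.MaxValue` (2147483647), so equal precision is not enough to guarantee a lossless cast and the comparison must be strict.
   
   ```scala
   // Hypothetical stand-ins for Spark's integral types, for illustration only.
   sealed trait IntegralKind { def decimalPrecision: Int }
   case object ByteKind  extends IntegralKind { val decimalPrecision = 3  } // max 127
   case object ShortKind extends IntegralKind { val decimalPrecision = 5  } // max 32767
   case object IntKind   extends IntegralKind { val decimalPrecision = 10 } // max 2147483647
   case object LongKind  extends IntegralKind { val decimalPrecision = 19 } // max ~9.2e18
   
   // A decimal(precision, scale) fits losslessly into the integral type only
   // when scale == 0 and precision is STRICTLY below the type's decimal
   // precision; with equal precision the decimal can hold larger values and
   // the cast may overflow.
   def isTighterThanIntegral(precision: Int, scale: Int, other: IntegralKind): Boolean =
     precision < other.decimalPrecision && scale == 0
   
   // decimal(9, 0) maxes out at 999999999 < Int.MaxValue: safe.
   assert(isTighterThanIntegral(9, 0, IntKind))
   // decimal(10, 0) can hold 9999999999 > Int.MaxValue: not tighter.
   assert(!isTighterThanIntegral(10, 0, IntKind))
   // Any nonzero scale means fractional digits, which an integral cannot hold.
   assert(!isTighterThanIntegral(5, 1, LongKind))
   ```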



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

