fe2s commented on code in PR #39099:
URL: https://github.com/apache/spark/pull/39099#discussion_r1058646060


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/types/Decimal.scala:
##########
@@ -374,7 +374,7 @@ final class Decimal extends Ordered[Decimal] with Serializable {
       if (scale < _scale) {
         // Easier case: we just need to divide our scale down

Review Comment:
   Yeah, my initial version was similar to that. But then I realized we were duplicating the same logic in two branches, so I decided to merge them. I'm fine with either version, though. Does the following look good?
   
   ```scala
           if (diff > MAX_LONG_DIGITS) {
             lv = roundMode match {
               case ROUND_FLOOR => if (lv < 0) -1L else 0L
               case ROUND_CEILING => if (lv > 0) 1L else 0L
               case ROUND_HALF_UP | ROUND_HALF_EVEN => 0L
               case _ => throw QueryExecutionErrors.unsupportedRoundingMode(roundMode)
             }
           } else {
             // original code
           }
   ```
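   
   As a quick cross-check of that mapping (a standalone sketch, not part of this PR; the object name and sample values are made up for illustration), `java.math.BigDecimal` resolves the same cases when the scale is dropped by more digits than the unscaled long can hold (MAX_LONG_DIGITS is 18):
   
   ```scala
   import java.math.{BigDecimal => JBigDecimal, RoundingMode}
   
   // Standalone illustration only, not Spark's Decimal: a 9-digit unscaled value
   // shifted 20 places to the right collapses to -1, 0, or 1 depending on the
   // rounding mode and the sign of the value.
   object RoundingFallbackSketch {
     def main(args: Array[String]): Unit = {
       val lv = -123456789L   // hypothetical negative unscaled value (9 digits)
       val diff = 20          // scale reduction larger than MAX_LONG_DIGITS (18)
       val bd = JBigDecimal.valueOf(lv).movePointLeft(diff)
   
       println(bd.setScale(0, RoundingMode.FLOOR))     // -1 : floor of a tiny negative value
       println(bd.setScale(0, RoundingMode.CEILING))   //  0 : ceiling of a tiny negative value
       println(bd.setScale(0, RoundingMode.HALF_UP))   //  0 : |fraction| < 0.5
       println(bd.setScale(0, RoundingMode.HALF_EVEN)) //  0 : |fraction| < 0.5
     }
   }
   ```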


