Github user mgaido91 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20350#discussion_r163008946
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/types/Decimal.scala ---
    @@ -237,14 +238,26 @@ final class Decimal extends Ordered[Decimal] with Serializable {
       /**
        * Create new `Decimal` with given precision and scale.
        *
    -   * @return a non-null `Decimal` value if successful or `null` if overflow would occur.
    +   * @return a non-null `Decimal` value if successful. Otherwise, if `nullOnOverflow` is true, null
    +   *         is returned; if `nullOnOverflow` is false, an `ArithmeticException` is thrown.
        */
       private[sql] def toPrecision(
           precision: Int,
           scale: Int,
    -      roundMode: BigDecimal.RoundingMode.Value = ROUND_HALF_UP): Decimal = {
    +      roundMode: BigDecimal.RoundingMode.Value = ROUND_HALF_UP,
    +      nullOnOverflow: Boolean = true): Decimal = {
         val copy = clone()
    -    if (copy.changePrecision(precision, scale, roundMode)) copy else null
    +    if (copy.changePrecision(precision, scale, roundMode)) {
    +      copy
    +    } else {
    +      val message = s"$toDebugString cannot be represented as Decimal($precision, $scale)."
    +      if (nullOnOverflow) {
    +        logWarning(s"$message NULL is returned.")
    --- End diff --
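
    For context, here is the behavior the updated doc comment describes, as a
    small hedged sketch (`toPrecision` is `private[sql]`, so this only compiles
    from `org.apache.spark.sql` internals, and the value is illustrative):

    ```scala
    val d = Decimal(12345)                       // 12345 needs 5 digits
    d.toPrecision(3, 0)                          // does not fit Decimal(3, 0): returns null (default)
    d.toPrecision(3, 0, nullOnOverflow = false)  // does not fit: throws ArithmeticException
    ```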
    
    I see your point, and I agree with you. But I wanted to leave some trace of
    what was happening. What about using DEBUG as the log level? In that case,
    most of the time we are not logging anything, but if we want to check
    whether an overflow is happening, we still can. What do you think?
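
    To make the suggestion concrete, here is a hedged sketch of how the whole
    method could look with DEBUG-level logging (the throw branch follows the
    updated doc comment, and I am assuming this class mixes in Spark's
    `Logging` trait so `logDebug` is available):

    ```scala
    private[sql] def toPrecision(
        precision: Int,
        scale: Int,
        roundMode: BigDecimal.RoundingMode.Value = ROUND_HALF_UP,
        nullOnOverflow: Boolean = true): Decimal = {
      val copy = clone()
      if (copy.changePrecision(precision, scale, roundMode)) {
        copy
      } else {
        val message = s"$toDebugString cannot be represented as Decimal($precision, $scale)."
        if (nullOnOverflow) {
          // DEBUG instead of WARN: nothing is emitted under the default log
          // level, but overflows can still be traced by enabling DEBUG.
          logDebug(s"$message NULL is returned.")
          null
        } else {
          throw new ArithmeticException(message)
        }
      }
    }
    ```

    With the usual log4j setup, the trace could then be enabled with something
    like `log4j.logger.org.apache.spark.sql.types.Decimal=DEBUG`.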


---
