Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19576#discussion_r146979542
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/types/Decimal.scala ---
    @@ -234,31 +234,28 @@ final class Decimal extends Ordered[Decimal] with Serializable {
         changePrecision(precision, scale, ROUND_HALF_UP)
       }
     
    -  def changePrecision(precision: Int, scale: Int, mode: Int): Boolean = mode match {
    -    case java.math.BigDecimal.ROUND_HALF_UP => changePrecision(precision, scale, ROUND_HALF_UP)
    -    case java.math.BigDecimal.ROUND_HALF_EVEN => changePrecision(precision, scale, ROUND_HALF_EVEN)
    -  }
    -
       /**
        * Create new `Decimal` with given precision and scale.
        *
    -   * @return `Some(decimal)` if successful or `None` if overflow would occur
    +   * @return a non-null `Decimal` value if successful or `null` if overflow would occur.
        */
       private[sql] def toPrecision(
           precision: Int,
           scale: Int,
    -      roundMode: BigDecimal.RoundingMode.Value = ROUND_HALF_UP): Option[Decimal] = {
    +      roundMode: BigDecimal.RoundingMode.Value = ROUND_HALF_UP): Decimal = {
    --- End diff --
    
    `Option` is hard to use from Java code (the codegen path), so I changed the return type to a nullable `Decimal`.
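    
    For context, a rough sketch of how the two signatures look from the Java side of the codegen path. This is illustrative only, not actual Spark-generated code; the class and method names are placeholders, and the two methods assume the before/after versions of `Decimal` respectively:
    
    ```java
    // Illustrative sketch only -- not real Spark codegen output.
    import org.apache.spark.sql.types.Decimal;
    import org.apache.spark.sql.types.Decimal$;
    
    class ToPrecisionFromJava {
      // Before: Option[Decimal] forces the generated Java through scala.Option.
      static Decimal withOption(Decimal d, int precision, int scale) {
        scala.Option<Decimal> r =
            d.toPrecision(precision, scale, Decimal$.MODULE$.ROUND_HALF_UP());
        return r.isEmpty() ? null : r.get();
      }
    
      // After: overflow is just a null check on the returned Decimal.
      static Decimal withNullable(Decimal d, int precision, int scale) {
        return d.toPrecision(precision, scale, Decimal$.MODULE$.ROUND_HALF_UP());
      }
    }
    ```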

