Github user mgaido91 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22494#discussion_r219492191
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -1345,6 +1345,16 @@ object SQLConf {
           .booleanConf
           .createWithDefault(true)
     
    +  val LITERAL_PRECISE_PRECISION =
    +    buildConf("spark.sql.literal.precisePrecision")
    +      .internal()
    +      .doc("When integral literals are used with decimals in binary 
operators, Spark will " +
    +        "pick a precise precision for the literals to calculate the 
precision and scale " +
    +        "of the result decimal, when this config is true. By picking a 
precise precision, we " +
    +        "can avoid wasting precision, to reduce the possibility of 
overflow.")
    --- End diff ---
    
    `to reduce the possibility of overflow`: actually this is not always true; it
    depends on the value of `DECIMAL_OPERATIONS_ALLOW_PREC_LOSS`. If
    `DECIMAL_OPERATIONS_ALLOW_PREC_LOSS` is true, the risk is a precision loss,
    but we don't overflow. If it is false, then this statement is right.
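    
    To illustrate the two behaviours, a minimal sketch (my assumptions:
    `DECIMAL_OPERATIONS_ALLOW_PREC_LOSS` corresponds to the SQL config key
    `spark.sql.decimalOperations.allowPrecisionLoss`, the result types in the
    comments follow the current `DecimalPrecision` promotion rules, and the
    object name is made up for the example):
    
    import org.apache.spark.sql.SparkSession
    
    object DecimalPrecisionLossDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("decimal-precision-loss-demo")
          .getOrCreate()
    
        // Multiplying two DECIMAL(38, 18) values: the exact result type would
        // be DECIMAL(77, 36), which exceeds Spark's 38-digit maximum precision.
        val query =
          """SELECT CAST(123456789.123456789 AS DECIMAL(38, 18)) *
            |       CAST(987654321.987654321 AS DECIMAL(38, 18)) AS product
            |""".stripMargin
    
        // Precision loss allowed: the scale is cut back (here to DECIMAL(38, 6))
        // so the integral digits fit; we lose fractional digits, not the value.
        spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "true")
        spark.sql(query).show(truncate = false)
    
        // Precision loss not allowed: the scale is kept (DECIMAL(38, 36)), the
        // integral part of the product no longer fits, and the result is NULL.
        spark.conf.set("spark.sql.decimalOperations.allowPrecisionLoss", "false")
        spark.sql(query).show(truncate = false)
    
        spark.stop()
      }
    }
    
    With the flag on, the product comes back truncated to six decimal places;
    with it off, the same query returns NULL, which is exactly the overflow
    case the doc text is talking about.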


