Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/22450
  
    I feel there are more places we need to fix for negative scale. I couldn't find any design doc for negative scale in Spark, and I believe we support it by accident.
    
    That said, fixing division only addresses the specific case the user reported, which is not ideal. We should either officially support negative scale and fix all the cases, or officially forbid it. However, neither option can land as a bug fix for branches 2.3 and 2.4.
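
    For context on why division in particular misbehaves, here is a sketch of the result-type rule documented in `DecimalPrecision` (precision = `p1 - s1 + s2 + max(6, s1 + p2 + 1)`, scale = `max(6, s1 + p2 + 1)`); the helper name is mine for illustration, not Spark's API:

    ```scala
    // Sketch only: the documented result-type rule for decimal division.
    //   scale     = max(6, s1 + p2 + 1)
    //   precision = p1 - s1 + s2 + scale
    def divisionResultType(p1: Int, s1: Int, p2: Int, s2: Int): (Int, Int) = {
      val scale = math.max(6, s1 + p2 + 1)
      (p1 - s1 + s2 + scale, scale)
    }

    // Non-negative scales, e.g. Decimal(11, 0) / Decimal(10, 0):
    println(divisionResultType(11, 0, 10, 0))  // (22,11): room for all digits

    // Negative scale on the divisor, e.g. Decimal(11, 0) / Decimal(4, -6):
    // the negative s2 subtracts from the precision, so the result type keeps
    // fewer fractional digits than the non-negative-scale case would.
    println(divisionResultType(11, 0, 4, -6))  // (11,6)
    ```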
    
    Instead, I'm proposing a different fix: unofficially forbid negative scale. Users can still create a decimal value with negative scale, but Spark itself should avoid generating such values. See 
https://github.com/apache/spark/pull/22470
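
    To be concrete about what "a decimal value with negative scale" means: this is plain `java.math.BigDecimal` behavior (the type backing Spark's `Decimal`), shown here as a standalone sketch:

    ```scala
    import java.math.{BigDecimal, BigInteger}

    // A BigDecimal's value is unscaledValue * 10^(-scale), so scale -3
    // means the unscaled value is multiplied by 10^3: 1 * 10^3 = 1000.
    val d = new BigDecimal(BigInteger.ONE, -3)
    println(d)          // 1E+3
    println(d.scale)    // -3

    // Parsing scientific notation also yields a negative scale:
    println(new BigDecimal("1e3").scale)  // -3
    ```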

