Ngone51 edited a comment on issue #26881: [SPARK-30252][SQL] Disallow negative scale of Decimal
URL: https://github.com/apache/spark/pull/26881#issuecomment-575472015
 
 
   I reverted the check for max precision that was added in `set(decimal: BigDecimal)` because it can break the overflow check. That is, Spark is allowed to create a decimal whose precision is larger than 38; the overflow check then decides whether to return null or to throw an exception, depending on ANSI mode. So, if we add the check early in `set`, we'll hit an exception early, before the overflow check even runs.
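   
   To illustrate the ordering, here is a rough spark-shell sketch. `Decimal(BigDecimal)` goes through `set(decimal: BigDecimal)`, and `changePrecision` stands in for where the overflow check happens; treat the exact call sequence as illustrative, not as the precise code path:
   
   ```scala
   import org.apache.spark.sql.types.Decimal
   
   // Step 1: Spark materializes the raw product. Its precision (46 here)
   // legitimately exceeds MAX_PRECISION (38) at this point, so a precision
   // check inside set() would already throw before step 2 can run.
   val raw = Decimal(
     BigDecimal("11111111111111111111.123") * BigDecimal("99999999999999999999.123"))
   
   // Step 2: only now does the overflow check run. changePrecision returns
   // false when the value does not fit the target (38, 6), and the caller
   // then decides whether to produce null (non-ANSI) or throw (ANSI).
   val fits = raw.changePrecision(38, 6)  // false for this value
   ```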
   
   For example, take the query below:
   
   `spark.sql("select cast(11111111111111111111.123 as decimal(23, 3)) * cast(99999999999999999999.123 as decimal(23, 3))").show`
   
   Without the max precision check, we'll get `null`; with it, we'll get an exception instead.
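   
   Concretely, the null-vs-exception outcome is controlled by the ANSI flag once the overflow check is allowed to run. A minimal sketch, assuming the flag is `spark.sql.ansi.enabled` (the exact name on the branch this PR targets is an assumption here):
   
   ```scala
   val q = """select cast(11111111111111111111.123 as decimal(23, 3)) *
             |       cast(99999999999999999999.123 as decimal(23, 3))""".stripMargin
   
   // Non-ANSI (the default): the overflow check turns the result into null.
   spark.conf.set("spark.sql.ansi.enabled", "false")
   spark.sql(q).show()     // prints a single null row
   
   // ANSI: the same overflow throws at runtime instead of returning null.
   spark.conf.set("spark.sql.ansi.enabled", "true")
   // spark.sql(q).show()  // would throw an ArithmeticException
   ```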
