mgaido91 commented on issue #27629: [SPARK-28067][SQL]Fix incorrect results 
during aggregate sum for decimal overflow by throwing exception
URL: https://github.com/apache/spark/pull/27629#issuecomment-588495360
 
 
   This PR would introduce regressions. Checking every sum means that a temporary 
overflow would cause an exception. E.g., if you sum MAX_INT, 10, -100, then 
MAX_INT + 10 would raise the exception. In the current code, this sum is 
handled properly and returns the correct result, because the temporary overflow 
is resolved by summing -100. So we would raise exceptions even when they are not 
needed (see the sketch below). IIRC, other DBs handle this properly, so temporary 
overflows don't cause exceptions.
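   To make the false positive concrete, here is a minimal plain-Scala sketch (not 
Spark internals) of the scenario above: checking every partial sum rejects 
Int.MaxValue, 10, -100 even though the final result fits in an Int, while 
accumulating in a wider type and checking only at the end succeeds.

```scala
object TemporaryOverflow {
  def main(args: Array[String]): Unit = {
    val values = Seq(Int.MaxValue, 10, -100)

    // What checking every sum would do: fail on the second element,
    // even though the final result (2147483557) is representable.
    def sumCheckingEachStep(xs: Seq[Int]): Int =
      xs.foldLeft(0)((acc, x) => Math.addExact(acc, x))

    // Accumulate in a wider type and check only once at the end.
    def sumCheckingAtEnd(xs: Seq[Int]): Int =
      Math.toIntExact(xs.foldLeft(0L)(_ + _))

    println(sumCheckingAtEnd(values))    // 2147483557
    println(sumCheckingEachStep(values)) // ArithmeticException: integer overflow
  }
}
```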
   
   The proper fix for this would be to use a larger data type for the buffer than 
the returned one. I remember I had a PR for that 
(https://github.com/apache/spark/pull/25347). You can check its comments and 
history; a sketch of the idea follows.
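   As an illustrative sketch of that approach (not the actual implementation in 
#25347): accumulate in a representation with more headroom than the declared 
result type, and check that the final value fits only when the aggregation 
finishes. The DECIMAL(10, 2) result type below is just an assumption for the 
example.

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

object WideBufferSum {
  // Hypothetical result type for the example: DECIMAL(10, 2),
  // i.e. at most 8 integer digits.
  val resultPrecision = 10
  val resultScale     = 2

  def sum(values: Seq[JBigDecimal]): JBigDecimal = {
    // The buffer has unlimited precision, so temporary overflows cannot happen.
    val buffer = values.foldLeft(JBigDecimal.ZERO)(_.add(_))

    // A single overflow check against the result type, at the very end.
    if (buffer.precision - buffer.scale > resultPrecision - resultScale)
      throw new ArithmeticException(
        s"sum $buffer does not fit in DECIMAL($resultPrecision, $resultScale)")
    buffer.setScale(resultScale, RoundingMode.HALF_UP)
  }
}
```

   Spark's real aggregate would of course use its internal Decimal type and a 
bounded (but wider) buffer precision rather than unlimited precision; the point 
is only where the single overflow check happens.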
