mgaido91 commented on issue #25035: [SPARK-28235][SQL] Sum of decimals should 
return a decimal with MAX_PRECISION
URL: https://github.com/apache/spark/pull/25035#issuecomment-511330578
 
 
   Thanks for your comment @cloud-fan. 
   
   > Even if you set the precision to 38 it can still overflow.
   
   Yes, that's true, but we'd support more use cases than we do now.
   
   > I don't think this is a better solution compared to p + 10, as it's 
already hard to hit overflow according to #25035 (comment)
   
   That's true, especially for a UT. But in a real environment, since Spark is 
used for Big Data applications, having ~11,000,000,000 records can happen. I 
think many Spark users have datasets larger than that. It's just hard to 
reproduce in a UT...
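   To make the trade-off concrete, here is a small sketch (mine, not from the PR) of the arithmetic behind the `p + 10` rule: summing `n` values of decimal precision `p` can need up to `p + ceil(log10(n))` digits, so `p + 10` only guarantees no overflow up to roughly 10^10 (~10 billion) rows, while widening to MAX_PRECISION = 38 raises that bound for inputs with small `p`.

   ```python
   import math

   def digits_needed(p: int, n: int) -> int:
       """Worst-case digits needed to hold the sum of n values,
       each with p significant decimal digits."""
       return p + math.ceil(math.log10(n))

   # Illustrative assumption: inputs have precision p = 10.
   # With the p + 10 rule the result type has 20 digits, so:
   print(digits_needed(10, 10**10))  # 20 -> still fits in p + 10
   print(digits_needed(10, 10**11))  # 21 -> exceeds p + 10, may overflow
   ```

   With MAX_PRECISION = 38 the same p = 10 inputs could sum ~10^28 rows before overflow becomes possible, which is why the wider type covers more real-world datasets even though, as noted above, it can still overflow in principle.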
