cloud-fan commented on pull request #29026:
URL: https://github.com/apache/spark/pull/29026#issuecomment-655259298


   > So now we have UnsafeRow.setDecimal silently returns a null for an 
overflowed decimal value in setDecimal, but getDecimal throws error. There is 
inconsistency here. Why is that ok?
   
   Correction: `UnsafeRow.setDecimal` returns void, so it cannot return null itself. This PR fixes `UnsafeRow.setDecimal` so that `getDecimal` returns null when the value has overflowed.
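   To illustrate the read-side semantics being described (return null for a value that no longer fits the target precision), here is a minimal standalone sketch. The helper `fitOrNull` and its signature are hypothetical, not Spark's actual `Decimal`/`UnsafeRow` API:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DecimalOverflowSketch {
    // Hypothetical helper: round the value to the target scale, then return
    // null if the result exceeds the target precision (i.e. it "overflowed"),
    // rather than returning a truncated or wrong value.
    static BigDecimal fitOrNull(BigDecimal value, int precision, int scale) {
        BigDecimal rounded = value.setScale(scale, RoundingMode.HALF_UP);
        return (rounded.precision() > precision) ? null : rounded;
    }

    public static void main(String[] args) {
        // Fits DECIMAL(5,2):
        System.out.println(fitOrNull(new BigDecimal("123.456"), 5, 2));
        // Overflows DECIMAL(5,2), so the reader observes null:
        System.out.println(fitOrNull(new BigDecimal("12345.6"), 5, 2));
    }
}
```

   The point of the PR, per the discussion above, is that the null is produced deliberately on read rather than silently materializing a corrupt decimal.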
   
   > Earlier I think the decision was to not do the checking per row, but now don't we end up doing that in some of the cases?
   
   Under ANSI mode, overflow has to be checked per row, as is already done by the `Add` expression. This PR does not change that.
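   As a rough sketch of that per-row ANSI behavior (hypothetical helper, not the actual `Add` expression's codegen): each row's result is checked against the result type's precision, and an overflow raises an error instead of producing null:

```java
import java.math.BigDecimal;

public class AnsiAddSketch {
    // Hypothetical per-row check: under ANSI mode an overflowing addition
    // should fail with an error, not silently return null.
    static BigDecimal addAnsi(BigDecimal a, BigDecimal b, int precision) {
        BigDecimal sum = a.add(b);
        if (sum.precision() > precision) {
            throw new ArithmeticException("Decimal overflow in add: " + sum);
        }
        return sum;
    }

    public static void main(String[] args) {
        // 40 + 60 = 100, which fits precision 3:
        System.out.println(addAnsi(new BigDecimal("40"), new BigDecimal("60"), 3));
        try {
            // 999 + 1 = 1000, which exceeds precision 3:
            addAnsi(new BigDecimal("999"), new BigDecimal("1"), 3);
        } catch (ArithmeticException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

   Non-ANSI mode, by contrast, is where the null-on-overflow behavior discussed earlier applies.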


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
