tustvold commented on issue #2387: URL: https://github.com/apache/arrow-rs/issues/2387#issuecomment-1209424280
> Which means we have to do validation at runtime to check overflow.

Given we don't care about this for other types, because it has serious performance implications, why would we care about it for decimals? I guess my question can be phrased as:

> Given decimal overflow cannot lead to undefined behaviour, why do we need to check precision?

A somewhat related point: in Rust, signed integer overflow is **not** undefined behaviour. It is explicitly defined to either panic (debug builds) or wrap as two's complement (release builds), precisely because checking at runtime everywhere is prohibitively expensive. I don't see an obvious reason we should treat decimals any differently; the overflow behaviour is perfectly well defined.
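To make the point concrete, here is a minimal sketch in plain Rust (not arrow-rs API; the names are illustrative). A `Decimal128` value is backed by an `i128`, so exceeding the declared precision still yields a well-defined integer, and callers who want validation can opt into checked arithmetic themselves:

```rust
fn main() {
    // Decimal128 values are backed by i128. Exceeding the declared
    // precision still produces a well-defined i128, never UB.
    let a: i128 = i128::MAX;

    // In release builds `a + 1` wraps to i128::MIN (two's complement);
    // in debug builds it panics. Neither outcome is undefined behaviour,
    // so wrapping_add makes the release-mode semantics explicit.
    let wrapped = a.wrapping_add(1);
    assert_eq!(wrapped, i128::MIN);

    // Callers who do want runtime overflow validation can opt in with
    // checked operations, paying the branch cost only where it matters.
    assert_eq!(a.checked_add(1), None);
}
```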
