HaoYang670 commented on issue #2387:
URL: https://github.com/apache/arrow-rs/issues/2387#issuecomment-1209401225

   > What is the purpose of the precision argument?
   
   As far as I know, the precision is the `decimal` precision, i.e. how many 
digits the value has in its decimal representation.
   
   The `precision` adds a runtime bound on the value. (The underlying bit width 
(128 or 256) is a compile-time bound.) Whenever you change the value of 
`precision` at runtime, you may need to re-validate the values.
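
   To make this concrete, here is a minimal sketch of such a runtime check for 
a 128-bit decimal (`fits_precision` is a hypothetical helper, not the actual 
arrow-rs API):

   ```rust
   /// Hypothetical helper: does `value` fit within `precision` decimal digits?
   /// The i128 storage is a compile-time bound; `precision` is only known at
   /// runtime, so this check also has to happen at runtime.
   fn fits_precision(value: i128, precision: u32) -> Option<bool> {
       // The largest magnitude with `precision` digits is 10^precision - 1.
       // checked_pow guards against precision > 38, which would overflow i128.
       let max = 10_i128.checked_pow(precision)? - 1;
       Some((-max..=max).contains(&value))
   }

   fn main() {
       // 12345 has 5 decimal digits: it fits precision 5 but not precision 4.
       assert_eq!(fits_precision(12_345, 5), Some(true));
       assert_eq!(fits_precision(12_345, 4), Some(false));
   }
   ```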
   
   > why do we need to validate Decimal Precision at all. 
   
   Because the Decimal type is somewhat a mixture of a static type (128 bits 
or 256 bits) and a dynamic type (precision and scale), we have to do validation 
at runtime to check for overflow.
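
   For example (again a hedged sketch, with a made-up `checked_decimal_add` 
helper rather than arrow-rs code), an addition can stay well within the static 
i128 bound while overflowing the dynamic precision bound, and only a runtime 
check catches that:

   ```rust
   /// Hypothetical helper: add two decimal values stored as i128, rejecting
   /// results that overflow either the static bit width or the dynamic precision.
   fn checked_decimal_add(a: i128, b: i128, precision: u32) -> Option<i128> {
       let max = 10_i128.checked_pow(precision)? - 1;
       let sum = a.checked_add(b)?;               // static (bit-width) overflow
       (-max..=max).contains(&sum).then_some(sum) // dynamic (precision) overflow
   }

   fn main() {
       // With precision 2 the bound is 99: 60 + 50 = 110 fits in i128
       // trivially, but exceeds the declared precision.
       assert_eq!(checked_decimal_add(60, 50, 2), None);
       assert_eq!(checked_decimal_add(40, 50, 2), Some(90));
   }
   ```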
   
   >  I'm not really clear on what the implications of a value that exceeds the 
precision actually are.
   
   I guess the behavior is undefined: once a value violates its declared 
precision, the invariant is broken, and `False` can imply anything.

