maccamlc commented on a change in pull request #884:
URL: https://github.com/apache/avro/pull/884#discussion_r427673945
##########
File path: lang/java/avro/src/main/java/org/apache/avro/Conversions.java
##########
@@ -117,6 +115,25 @@ public GenericFixed toFixed(BigDecimal value, Schema schema, LogicalType type) {
return new GenericData.Fixed(schema, bytes);
}
+
+  private static BigDecimal validate(LogicalTypes.Decimal decimal, BigDecimal value) {
+ int scale = decimal.getScale();
+ int valueScale = value.scale();
+ if (valueScale > scale) {
+      throw new AvroTypeException("Cannot encode decimal with scale " + valueScale + " as scale " + scale);
+ } else if (valueScale < scale) {
+ value = value.setScale(scale, ROUND_UNNECESSARY);
+ }
+
+ int precision = decimal.getPrecision();
+ int valuePrecision = value.precision();
Review comment:
I noticed in my testing that a fixed type of size 12 can have a max
precision of 28. However, some 29-precision decimals would still fit inside
the 12-byte array, and are therefore currently permitted without the offset
error I mentioned in the issue description.
With this change, those 29-precision decimals would now error.
My thinking is that the fixed type's configuration should match the actual
value, so checking precision makes sense. But I'm happy to hear any
counter-arguments.
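To make the size-12 numbers concrete, here is a small sketch (not part of this PR; the `maxPrecision` helper is my own illustration of the floor(log10(2) * (8 * size - 1)) bound for a two's-complement fixed). It shows why 28 is the largest precision that is always representable in 12 bytes, while a particular 29-digit unscaled value (2^95 - 1) still fits:

```java
import java.math.BigInteger;

public class FixedPrecisionCheck {
  // Largest precision guaranteed to fit in an N-byte two's-complement fixed:
  // floor(log10(2) * (8*N - 1)) decimal digits.
  static long maxPrecision(int fixedSize) {
    return (long) Math.floor(Math.log10(2) * (8L * fixedSize - 1));
  }

  public static void main(String[] args) {
    System.out.println(maxPrecision(12)); // 28

    // 2^95 - 1 is a 29-digit value (~3.96e28) that still fits:
    // 95 value bits + 1 sign bit = 96 bits = exactly 12 bytes.
    BigInteger borderline = BigInteger.TWO.pow(95).subtract(BigInteger.ONE);
    System.out.println(borderline.toString().length());  // 29 digits
    System.out.println(borderline.toByteArray().length); // 12 bytes
  }
}
```

So any precision-28 value fits (10^28 < 2^95), but only some precision-29 values do, which is why a strict precision check rejects values the byte array could technically hold.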
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]