[
https://issues.apache.org/jira/browse/NIFI-7909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17212702#comment-17212702
]
ASF subversion and git services commented on NIFI-7909:
-------------------------------------------------------
Commit 4c235f040562e2ca5b723bd3cc85be435c48682e in nifi's branch
refs/heads/main from Matt Burgess
[ https://gitbox.apache.org/repos/asf?p=nifi.git;h=4c235f0 ]
NIFI-7909: Change DataTypeUtils.toInteger() to use Math.toIntExact()
This closes #4596
Signed-off-by: Mike Thomsen <[email protected]>
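For context, this is a sketch (not the actual NiFi source) of the behavior change: a plain narrowing cast from long to int wraps around silently, while Math.toIntExact() throws an ArithmeticException on overflow, which is what the commit switches DataTypeUtils.toInteger() to use. The class name ToIntExactDemo is ours, for illustration only.

```java
// Illustration of NIFI-7909: why Math.toIntExact() is safer than an (int) cast.
public class ToIntExactDemo {
    public static void main(String[] args) {
        long tooBig = 2156760545L; // the value from the bug report

        // Old behavior: a narrowing cast silently wraps around,
        // producing the "crap" value seen in the Avro output.
        int truncated = (int) tooBig;
        System.out.println("truncated = " + truncated); // -2138206751

        // New behavior: Math.toIntExact fails loudly instead.
        try {
            Math.toIntExact(tooBig);
            System.out.println("no exception (unexpected)");
        } catch (ArithmeticException e) {
            System.out.println("ArithmeticException: " + e.getMessage());
        }
    }
}
```

With this change, an out-of-range value causes the record conversion to fail instead of emitting wrong data, matching the exception avro-tools already raised.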
> ConvertRecord writes invalid data when converting long to int
> -------------------------------------------------------------
>
> Key: NIFI-7909
> URL: https://issues.apache.org/jira/browse/NIFI-7909
> Project: Apache NiFi
> Issue Type: Bug
> Reporter: Christophe Monnet
> Assignee: Matt Burgess
> Priority: Major
> Time Spent: 20m
> Remaining Estimate: 0h
>
> [https://apachenifi.slack.com/archives/C0L9UPWJZ/p1602145019023300]
> {quote}I use ConvertRecord to convert from JSON to Avro.
> For a field declared as "int" in the Avro schema, if the JSON payload contains a
> number that is too big, NiFi does not throw an error but writes "crap" into the
> Avro file. Is that intended?
> When using avrotool it throws an exception:
> org.codehaus.jackson.JsonParseException: Numeric value (2156760545) out of
> range of int
> The ConvertRecord is configured with JsonTreeReader (Infer Schema strategy)
> and AvroRecordSetWriter (Use 'Schema Text' Property).
> So I guess NiFi converts an inferred long to an explicitly specified int?
> How can I make NiFi less lenient? I would prefer a failure to wrong data
> in the output.
> {quote}
> Workaround: use ValidateRecord.
> I'm also wondering if the ConsumeKafkaRecord processors could be affected.
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)