[ https://issues.apache.org/jira/browse/NIFI-7909?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matt Burgess reassigned NIFI-7909:
----------------------------------

    Assignee: Matt Burgess

> ConvertRecord writes invalid data when converting long to int
> -------------------------------------------------------------
>
>                 Key: NIFI-7909
>                 URL: https://issues.apache.org/jira/browse/NIFI-7909
>             Project: Apache NiFi
>          Issue Type: Bug
>    Affects Versions: 1.11.4, 1.12.1
>            Reporter: Christophe Monnet
>            Assignee: Matt Burgess
>            Priority: Major
>
> [https://apachenifi.slack.com/archives/C0L9UPWJZ/p1602145019023300]
> {quote}I use ConvertRecord to convert from JSON to Avro.
> For a field declared as "int" in the Avro schema, if the JSON payload contains a
> number that is too big, NiFi does not throw an error but writes garbage into the
> Avro file. Is that intended?
> When using avro-tools it throws an exception:
> org.codehaus.jackson.JsonParseException: Numeric value (2156760545) out of range of int
> ConvertRecord is configured with JsonTreeReader (Infer Schema strategy)
> and AvroRecordSetWriter (Use 'Schema Text' Property).
> So I guess NiFi converts an inferred long to an explicitly specified int?
> How can I make NiFi less lenient? I would prefer a failure to wrong data
> in the output.
> {quote}
> Workaround: use ValidateRecord.
> I'm also wondering whether the ConsumeKafkaRecord processors could be affected.
>  
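> A minimal sketch of the suspected failure mode, assuming the writer ends up doing a
> plain Java narrowing cast (this illustrates JVM cast semantics, not NiFi's exact
> conversion path): a long outside the int range wraps silently instead of failing.
> {code:java}
> public class NarrowingDemo {
>     public static void main(String[] args) {
>         long value = 2156760545L;             // value from the report, above Integer.MAX_VALUE (2147483647)
>         int narrowed = (int) value;           // silent two's-complement wrap, no exception
>         System.out.println(narrowed);         // prints -2138206751, i.e. "garbage" from the reporter's view
> 
>         // A checked conversion rejects the value instead of wrapping:
>         int checked = Math.toIntExact(value); // throws ArithmeticException: integer overflow
>     }
> }
> {code}
> A checked conversion like Math.toIntExact at the write path would turn the silent
> corruption into the failure the reporter asks for.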



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
