Mohit,

If you look at the Provenance events emitted by the processor, they show the
reason that the records are considered invalid. Specifically, for this use
case, it shows: "The following 2 fields had values whose type did not match
the schema: [/hs_kbps, /site_id]"
It appears that in your incoming data, the values are integers instead of
doubles. If you have the "Strict Type Checking" property set to "false" on the
processor, then it should allow this. Unfortunately, though, it appears that
there is a bug that causes integer values not to be considered valid when the
schema says the field is a double. I created a JIRA [1] for this.

In the meantime, if you update your schema to allow those fields to be
["null", "long", "double"], then you should be good.
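For example, applying that union to the /hs_kbps field from the schema below
would look something like this (a sketch only; note that in Avro the default
value must match the first type in the union, so with "null" first the default
becomes null rather than 0.0):

```json
{"name": "hs_kbps", "type": ["null", "long", "double"], "default": null}
```

The same change would apply to /site_id.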

Thanks
-Mark

[1] https://issues.apache.org/jira/browse/NIFI-5141


On May 2, 2018, at 11:57 AM, Mohit <[email protected]> wrote:

Hi,
I'm using the ValidateRecord processor to validate CSV and write the output
as Avro. For one file, it is transferring all the records to the invalid
relationship. The same file works fine with the ConvertCsvToAvro processor.

Avro Schema -
{
  "type": "record",
  "name": "cell_kpi_dump_geo",
  "namespace": "cell_kpi_dump_geo",
  "fields": [
    {"name": "month", "type": ["null", "string"], "default": null},
    {"name": "cell", "type": ["null", "string"], "default": null},
    {"name": "availability", "type": ["int", "null"], "default": 0},
    {"name": "cssr_speech", "type": ["int", "null"], "default": 0},
    {"name": "dcr_speech", "type": ["int", "null"], "default": 0},
    {"name": "hs_kbps", "type": ["double", "null"], "default": 0.0},
    {"name": "eul_kbps", "type": ["int", "null"], "default": 0},
    {"name": "tech", "type": ["null", "string"], "default": null},
    {"name": "site_id", "type": ["double", "null"], "default": 0.0},
    {"name": "longitude", "type": ["double", "null"], "default": 0.0},
    {"name": "latitude", "type": ["double", "null"], "default": 0.0}
  ]
}


Sample record –

May-16,KA4371D,95,100,0,151,,2G,4371,-1.606926,6.67223

Is there something I’m doing wrong?


Regards,
Mohit
