[
https://issues.apache.org/jira/browse/NIFI-5592?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16616276#comment-16616276
]
ASF subversion and git services commented on NIFI-5592:
-------------------------------------------------------
Commit c8fc1327eea046b03f47180334eeeb603bc4ea39 in nifi's branch
refs/heads/master from [~markap14]
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=c8fc132 ]
NIFI-5592: If an Exception is thrown by RecordReader.read() from
ConsumeKafkaRecord, route Record to parse.failure relationship
Signed-off-by: Pierre Villard <[email protected]>
This closes #3001.
> ConsumeKafkaRecord* processors can stop pulling data if the data doesn't
> match the configured schema
> ----------------------------------------------------------------------------------------------------
>
> Key: NIFI-5592
> URL: https://issues.apache.org/jira/browse/NIFI-5592
> Project: Apache NiFi
> Issue Type: Bug
> Components: Extensions
> Reporter: Mark Payne
> Assignee: Mark Payne
> Priority: Major
> Fix For: 1.8.0
>
>
> If the data in the Kafka topic does not adhere to the configured schema, the
> processor should route the data to 'parse.failure', but under some conditions
> we may instead encounter the following SchemaValidationException:
> {code}
> 2018-09-13 07:37:54,196 ERROR [Timer-Driven Process Thread-1] o.a.n.p.k.pubsub.ConsumeKafkaRecord_1_0 ConsumeKafkaRecord_1_0[id=c258fa20-0165-1000-ffff-ffffb401d2c7] Exception while processing data from kafka so will close the lease org.apache.nifi.processors.kafka.pubsub.ConsumerPool$SimpleConsumerLease@3fa54755 due to org.apache.nifi.processor.exception.ProcessException: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null: org.apache.nifi.processor.exception.ProcessException: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null
> org.apache.nifi.processor.exception.ProcessException: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null
>     at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(ConsumerLease.java:587)
>     at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.lambda$processRecords$2(ConsumerLease.java:330)
>     at java.util.HashMap$KeySpliterator.forEachRemaining(HashMap.java:1553)
>     at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
>     at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.processRecords(ConsumerLease.java:317)
>     at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.poll(ConsumerLease.java:178)
>     at org.apache.nifi.processors.kafka.pubsub.ConsumeKafkaRecord_1_0.onTrigger(ConsumeKafkaRecord_1_0.java:378)
>     at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
>     at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
>     at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
>     at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.nifi.serialization.SchemaValidationException: Field designation cannot be null
>     at org.apache.nifi.serialization.record.MapRecord.checkTypes(MapRecord.java:81)
>     at org.apache.nifi.serialization.record.MapRecord.<init>(MapRecord.java:52)
>     at org.apache.nifi.csv.CSVRecordReader.nextRecord(CSVRecordReader.java:113)
>     at org.apache.nifi.serialization.RecordReader.nextRecord(RecordReader.java:50)
>     at org.apache.nifi.processors.kafka.pubsub.ConsumerLease.writeRecordData(ConsumerLease.java:534)
>     ... 17 common frames omitted
> {code}
> In such a case, the processor constantly rolls back the session and re-pulls
> the same data, so the Exception recurs on every attempt and the processor
> stops making progress on the topic.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)