[ https://issues.apache.org/jira/browse/NIFI-3739?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15991052#comment-15991052 ]
ASF GitHub Bot commented on NIFI-3739:
--------------------------------------
Github user joewitt commented on a diff in the pull request:
https://github.com/apache/nifi/pull/1695#discussion_r114151657
--- Diff: nifi-nar-bundles/nifi-kafka-bundle/nifi-kafka-0-10-processors/src/main/java/org/apache/nifi/processors/kafka/pubsub/ConsumerLease.java ---
@@ -408,53 +424,63 @@ private void writeRecordData(final ProcessSession session, final List<ConsumerRe
         flowFile = session.write(flowFile, rawOut -> {
             final Iterator<ConsumerRecord<byte[], byte[]>> itr = records.iterator();
-            try (final OutputStream out = new BufferedOutputStream(rawOut)) {
-                final RecordSchema emptySchema = new SimpleRecordSchema(Collections.emptyList());
-                final RecordSet recordSet = new RecordSet() {
-                    @Override
-                    public RecordSchema getSchema() throws IOException {
-                        return emptySchema;
-                    }
-
-                    @Override
-                    public Record next() throws IOException {
-                        if (!itr.hasNext()) {
-                            return null;
-                        }
+            final RecordSchema emptySchema = new SimpleRecordSchema(Collections.emptyList());
+            final RecordSet recordSet = new RecordSet() {
+                @Override
+                public RecordSchema getSchema() throws IOException {
+                    return emptySchema;
+                }
+                @Override
+                public Record next() throws IOException {
+                    while (itr.hasNext()) {
                         final ConsumerRecord<byte[], byte[]> consumerRecord = itr.next();
                         final InputStream in = new ByteArrayInputStream(consumerRecord.value());
                         try {
                             final RecordReader reader = readerFactory.createRecordReader(ff, in, logger);
                             final Record record = reader.nextRecord();
                             return record;
-                        } catch (final SchemaNotFoundException | MalformedRecordException e) {
-                            throw new IOException(e);
+                        } catch (final Exception e) {
+                            FlowFile failureFlowFile = session.create();
--- End diff ---
Let's add the attributes we know for this message (topic/partition/offset).
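The reviewer's suggestion could be sketched as follows. This is a minimal, self-contained illustration, not the final patch: the attribute key names (`kafka.topic`, `kafka.partition`, `kafka.offset`) and the `failureAttributes` helper are assumptions made here for clarity. In the actual processor, the resulting map would be applied to the failure FlowFile via `session.putAllAttributes(failureFlowFile, attrs)` using the values from the failing `ConsumerRecord` (`topic()`, `partition()`, `offset()`).

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KafkaFailureAttributes {

    // Builds the attribute map the review asks for: topic/partition/offset of
    // the message that failed to parse. The key names are an assumption for
    // this sketch, not necessarily the keys the merged NiFi patch uses.
    static Map<String, String> failureAttributes(final String topic, final int partition, final long offset) {
        final Map<String, String> attrs = new LinkedHashMap<>();
        attrs.put("kafka.topic", topic);
        attrs.put("kafka.partition", String.valueOf(partition));
        attrs.put("kafka.offset", String.valueOf(offset));
        return attrs;
    }

    public static void main(final String[] args) {
        // Example: a record from partition 3 at offset 42 of topic "events".
        System.out.println(failureAttributes("events", 3, 42L));
    }
}
```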
> Create Processors for publishing records to and consuming records from Kafka
> ----------------------------------------------------------------------------
>
> Key: NIFI-3739
> URL: https://issues.apache.org/jira/browse/NIFI-3739
> Project: Apache NiFi
> Issue Type: New Feature
> Components: Extensions
> Reporter: Mark Payne
> Assignee: Mark Payne
> Fix For: 1.2.0
>
>
> With the new record readers & writers that have been added, it would
> be good to allow records to be pushed to and pulled from Kafka. Currently, we
> support demarcated data, but sometimes we can't correctly demarcate data in a
> way that keeps the format valid (JSON is a good example). We should have
> processors that use the record readers and writers for this.
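The JSON point in the issue description can be shown concretely. The helper names below (`demarcate`, `asJsonArray`) are invented for this sketch: joining individual JSON records with a demarcator yields a stream that is not itself one valid JSON document, whereas a record writer can emit a single well-formed array.

```java
public class DemarcationExample {

    // Joins individual JSON records with a newline demarcator. Each line is
    // valid JSON on its own, but the concatenation is NOT one valid JSON
    // document -- this is the problem the record writers solve.
    static String demarcate(final String... records) {
        return String.join("\n", records);
    }

    // What a record writer can produce instead: a single well-formed JSON array.
    static String asJsonArray(final String... records) {
        return "[" + String.join(",", records) + "]";
    }

    public static void main(final String[] args) {
        System.out.println(demarcate("{\"id\":1}", "{\"id\":2}"));
        System.out.println(asJsonArray("{\"id\":1}", "{\"id\":2}"));
    }
}
```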
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)