[
https://issues.apache.org/jira/browse/NIFI-4639?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16269749#comment-16269749
]
ASF GitHub Bot commented on NIFI-4639:
--------------------------------------
Github user MikeThomsen commented on a diff in the pull request:
https://github.com/apache/nifi/pull/2292#discussion_r153658243
--- Diff:
nifi-nar-bundles/nifi-kafka-bundle/nifi-kafka-0-10-processors/src/test/java/org/apache/nifi/processors/kafka/pubsub/TestPublisherLease.java
---
@@ -191,4 +204,37 @@ public Object answer(InvocationOnMock invocation)
throws Throwable {
verify(producer, times(1)).flush();
}
+
+ @Test
--- End diff ---
It ran to completion without errors.
> PublishKafkaRecord with Avro writer: schema lost from output
> ------------------------------------------------------------
>
> Key: NIFI-4639
> URL: https://issues.apache.org/jira/browse/NIFI-4639
> Project: Apache NiFi
> Issue Type: Bug
> Components: Extensions
> Affects Versions: 1.4.0
> Reporter: Matthew Silverman
> Attachments: Demo_Names_NiFi_bug.xml
>
>
> I have a {{PublishKafkaRecord_0_10}} configured with an
> {{AvroRecordSetWriter}}, in turn configured to "Embed Avro Schema". However,
> when I consume data from the Kafka stream I receive individual records that
> lack a schema header.
> As a workaround, I can send the flow files through a {{SplitRecord}}
> processor, which does embed the Avro schema into each resulting flow file.
> Comparing the code for {{SplitRecord}} and the {{PublishKafkaRecord}}
> processors, I believe the issue is that {{PublisherLease}} wipes the output
> stream after calling {{createWriter}}; however, it is
> {{AvroRecordSetWriter#createWriter}} that writes the Avro header to the
> output stream. {{SplitRecord}}, on the other hand, creates a new writer for
> each output record.
> I've attached my flow.
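The diagnosis quoted above can be sketched with a minimal, self-contained example. Note this uses hypothetical stand-in classes (`HeaderWritingWriter`, `buggyOutput`, `fixedOutput`), not the actual NiFi `PublisherLease` or `AvroRecordSetWriter` API; it only illustrates why resetting the stream after the writer is created drops a header that the writer emits at creation time:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class HeaderLossSketch {

    // Stand-in for a writer that, like AvroRecordSetWriter#createWriter is
    // described above, writes a schema header as soon as it is created.
    static class HeaderWritingWriter {
        private final OutputStream out;

        HeaderWritingWriter(OutputStream out) throws IOException {
            this.out = out;
            out.write("SCHEMA-HEADER|".getBytes()); // header written at creation time
        }

        void writeRecord(String record) throws IOException {
            out.write(record.getBytes());
        }
    }

    // Buggy pattern (as reported for PublisherLease): create the writer,
    // then reset the buffer, which wipes the header the writer just wrote.
    static String buggyOutput() throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        HeaderWritingWriter writer = new HeaderWritingWriter(buf);
        buf.reset();                   // header is lost here
        writer.writeRecord("record-1");
        return buf.toString();
    }

    // Workaround pattern (as SplitRecord does): create a fresh writer per
    // output, so every result starts with the header.
    static String fixedOutput() throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        HeaderWritingWriter writer = new HeaderWritingWriter(buf);
        writer.writeRecord("record-1");
        return buf.toString();
    }

    public static void main(String[] args) throws IOException {
        System.out.println("buggy: " + buggyOutput()); // no header
        System.out.println("fixed: " + fixedOutput()); // header + record
    }
}
```

Running the sketch shows the buggy path emitting a bare record while the fresh-writer path retains the header, matching the observed difference between {{PublishKafkaRecord_0_10}} and {{SplitRecord}}.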
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)