[
https://issues.apache.org/jira/browse/NIFI-4639?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16283551#comment-16283551
]
ASF GitHub Bot commented on NIFI-4639:
--------------------------------------
Github user markap14 commented on the issue:
https://github.com/apache/nifi/pull/2292
@joewitt @matthew-silverman I went ahead and updated the Kafka 1.0
publisher to follow the same pattern. Tested writing to kafka with Hortonworks
Content Encoded schemas, Hortonworks Attribute Schemas, Confluent Schema
Registry, and locally derived schemas. All appear to work as expected. Thanks
for the update @matthew-silverman! I'll see if I can do something to improve
performance but as Joe said, correct is always better than fast!
> PublishKafkaRecord with Avro writer: schema lost from output
> ------------------------------------------------------------
>
> Key: NIFI-4639
> URL: https://issues.apache.org/jira/browse/NIFI-4639
> Project: Apache NiFi
> Issue Type: Bug
> Components: Extensions
> Affects Versions: 1.4.0
> Reporter: Matthew Silverman
> Fix For: 1.5.0
>
> Attachments: Demo_Names_NiFi_bug.xml
>
>
> I have a {{PublishKafkaRecord_0_10}} configured with an
> {{AvroRecordSetWriter}}, in turn configured to "Embed Avro Schema". However,
> when I consume data from the Kafka stream I receive individual records that
> lack a schema header.
> As a workaround, I can send the flow files through a {{SplitRecord}}
> processor, which does embed the Avro schema into each resulting flow file.
> Comparing the code for {{SplitRecord}} and the {{PublishKafkaRecord}}
> processors, I believe the issue is that {{PublisherLease}} wipes the output
> stream after calling {{createWriter}}; however, it is
> {{AvroRecordSetWriter#createWriter}} that writes the Avro header to the
> output stream. {{SplitRecord}}, on the other hand, creates a new writer for
> each output record.
> I've attached my flow.
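The ordering problem described above can be illustrated with a small, self-contained sketch. The class names below ({{HeaderLossDemo}}, {{HeaderWritingWriter}}) and the literal header bytes are hypothetical stand-ins, not NiFi code: the stand-in writer emits its "header" at creation time, the way {{AvroRecordSetWriter#createWriter}} writes the Avro schema header, and the two orderings show why resetting the buffer after creating the writer wipes that header.

```java
import java.io.ByteArrayOutputStream;

public class HeaderLossDemo {
    // Hypothetical stand-in for a writer that emits its header as a side
    // effect of being created, mimicking AvroRecordSetWriter#createWriter.
    static class HeaderWritingWriter {
        private final ByteArrayOutputStream out;

        HeaderWritingWriter(ByteArrayOutputStream out) {
            this.out = out;
            out.writeBytes("AVRO-HEADER|".getBytes()); // header written at creation time
        }

        void writeRecord(String record) {
            out.writeBytes(record.getBytes());
        }
    }

    // resetAfterCreate=true mimics the buggy ordering reported here:
    // the buffer is cleared after the writer has already emitted its header.
    static String render(boolean resetAfterCreate) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        if (!resetAfterCreate) {
            buf.reset();                                 // fixed order: reset first...
        }
        HeaderWritingWriter writer = new HeaderWritingWriter(buf); // ...then create writer
        if (resetAfterCreate) {
            buf.reset();                                 // buggy order: wipes the header
        }
        writer.writeRecord("record1");
        return buf.toString();
    }

    public static void main(String[] args) {
        System.out.println("buggy: " + render(true));  // record without schema header
        System.out.println("fixed: " + render(false)); // header survives in the message
    }
}
```

Creating a fresh writer after the stream reset (per message, as {{SplitRecord}} effectively does per output flow file) keeps the header in each published record.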
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)