[
https://issues.apache.org/jira/browse/NIFI-10993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Paul Grey reassigned NIFI-10993:
--------------------------------
Assignee: Paul Grey
> PublishKafkaRecord should write key record (when configured) using correct
> schema
> ---------------------------------------------------------------------------------
>
> Key: NIFI-10993
> URL: https://issues.apache.org/jira/browse/NIFI-10993
> Project: Apache NiFi
> Issue Type: Bug
> Reporter: Paul Grey
> Assignee: Paul Grey
> Priority: Minor
>
> community report via
> (https://www.mail-archive.com/[email protected]/msg15668.html)
> to [email protected]
> To whom it may concern,
> Hello, I would like to report an issue for NiFi. Following the new Jira
> guidelines, I would like to request an account for ASF Jira in order to
> create a ticket.
> Regarding the bug: using NiFi 1.19.1, I would like to send a tombstone
> message (null payload) to Kafka and use the Confluent JDBC sink connector
> (with delete.enabled=true) to delete a record in our Postgres database. I
> believe as of NiFi 1.19, PublishKafkaRecord_2_6 supports the 'Publish
> Strategy: Use Wrapper' feature, which allows setting the Kafka message key and
> value (primary key as the Kafka key, null for the Kafka value). For the
> Record Key Writer, I'm using an AvroRecordSetWriter to validate and serialize
> the key against the Confluent Schema Registry (Schema Write Strategy:
> Confluent Schema Registry Reference, Schema Access Strategy: Use 'Schema
> Name' Property), but when sending the message I come across the error:
> PublishKafkaRecord_2_6[id=XXX] Failed to send FlowFile[filename=XXX] to
> Kafka: org.apache.nifi.processor.exception.ProcessException: Could not
> determine the Avro Schema to use for writing the content
> - Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Cannot
> write Confluent Schema Registry Reference because the Schema Identifier is
> not known
> I can confirm the configuration for the AvroRecordSetWriter,
> ConfluentSchemaRegistry controllers, and PublishKafkaRecord processor is
> correct, as I can send the Kafka message just fine using the default
> Publish Strategy (Use Content as Record Value). The error only occurs when
> using Use Wrapper together with the ConfluentSchemaRegistry.
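> For reference, with 'Use Wrapper' the record read by the configured Record
> Reader is expected to carry the Kafka key and value as fields of a wrapper
> record. A tombstone for the scenario above would look roughly like the
> sketch below (a hedged illustration only; "id" is a hypothetical primary
> key column, and field names follow the wrapper layout):
>
> ```json
> {
>   "key": { "id": 42 },
>   "value": null
> }
> ```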
> A workaround that has worked is using JsonRecordSetWriter with embedded
> JSON schemas, but it would be nice to continue using our Avro Schema Registry
> for this.
> I'd appreciate it if anyone has advice or experience with this issue;
> otherwise I'd like to log an issue in JIRA.
> Thank you,
> Austin Tao
--
This message was sent by Atlassian Jira
(v8.20.10#820010)