To whom it may concern,
Hello, I would like to report an issue with NiFi. Following the new Jira
guidelines, I would also like to request an ASF Jira account so that I can
create a ticket.
Regarding the bug: using NiFi 1.19.1, I would like to send a tombstone
message (null payload) to Kafka and use the Confluent JDBC sink connector
(with delete.enabled=true) to delete a record in our Postgres database. As of
NiFi 1.19, PublishKafkaRecord_2_6 supports the 'Publish Strategy: Use
Wrapper' feature, which allows setting both the Kafka message key and value
(the primary key as the Kafka key, null for the Kafka value). For the Record
Key Writer, I'm using an AvroRecordSetWriter to validate and serialize the key
against the Confluent schema registry (Schema Write Strategy: Confluent Schema
Registry Reference, Schema Access Strategy: Use 'Schema Name' Property), but
when sending the message I encounter the following error:

PublishKafkaRecord_2_6[id=XXX] Failed to send FlowFile[filename=XXX] to Kafka:
org.apache.nifi.processor.exception.ProcessException: Could not determine the
Avro Schema to use for writing the content
- Caused by: org.apache.nifi.schema.access.SchemaNotFoundException: Cannot
write Confluent Schema Registry Reference because the Schema Identifier is not
known
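
For context, the FlowFile content feeding the processor is a wrapper-style
record shaped roughly like this (a simplified sketch: the field values are
placeholders, our real primary-key fields differ, and I'm assuming the
key/value wrapper layout here):

    {
      "key": { "id": 42 },
      "value": null
    }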

I can confirm that the AvroRecordSetWriter and ConfluentSchemaRegistry
controller services, as well as the PublishKafkaRecord_2_6 processor, are all
configured correctly, since I can send the Kafka message just fine using the
default Publish Strategy (Use Content as Record Value). The error only occurs
when combining Use Wrapper with the ConfluentSchemaRegistry.
A workaround that has worked is using a JsonRecordSetWriter with embedded JSON
schemas, but it would be nice to continue using our Avro schema registry for
this.
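
For reference, the end result the flow is meant to produce is just the plain
Java client equivalent of a tombstone: a keyed record with a null value. A
minimal sketch (the broker/registry URLs, topic name, and key schema below are
placeholders, not our actual setup):

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class TombstoneSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");          // placeholder
            props.put("key.serializer",
                    "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("value.serializer",
                    "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "http://localhost:8081"); // placeholder

            // Placeholder key schema; our real primary-key record differs.
            Schema keySchema = new Schema.Parser().parse(
                    "{\"type\":\"record\",\"name\":\"Key\","
                    + "\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}");
            GenericRecord key = new GenericData.Record(keySchema);
            key.put("id", 42L);

            try (KafkaProducer<Object, Object> producer = new KafkaProducer<>(props)) {
                // A null value marks the record as a tombstone; with
                // delete.enabled=true the JDBC sink connector translates it
                // into a DELETE keyed on the primary key.
                producer.send(new ProducerRecord<>("my-topic", key, null));
            }
        }
    }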
I'd appreciate it if anyone has advice or experience with this issue;
otherwise, I'd like to log an issue in Jira.
Thank you,
Austin Tao
