I have asked this question on Stack Overflow:
https://stackoverflow.com/questions/71177535/apache-nifi-consumekafkarecord-2-6-consuming-message-from-topic-where-key-and-va

Basically I am trying to figure out whether ConsumeKafkaRecord_2_6 can consume and
deserialize messages from a Kafka topic where both the key and the value were
serialized using Avro, with the schemas stored in Confluent Schema Registry. With
my current setup I get an error like the one below. Is there a way to specify that
the messages are key/value pairs with both key and value Avro-serialized? Or do I
need to write my own consumer? Is there another way to process the messages? This
is a fairly common setup, following the convention that the schemas are named
[topic name]-key and [topic name]-value. I am able to read the messages using kcat
as follows:

kcat -b broker1:9092,broker2:9092,broker3:9092 -t mytopic -s avro -r 
http://schema-registry_url.com -p 0
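If it turns out a custom consumer is needed, a minimal sketch in plain Java might look like the following. It assumes the Confluent kafka-avro-serializer and kafka-clients dependencies are on the classpath, and uses KafkaAvroDeserializer for both key and value so the registry's [topic name]-key and [topic name]-value schemas are resolved automatically; the group id is a placeholder, and the broker/registry addresses are the ones from the kcat command above.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AvroKeyValueConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers",
                  "broker1:9092,broker2:9092,broker3:9092");
        props.put("group.id", "my-consumer-group");  // placeholder
        // Deserialize BOTH key and value via the schema registry
        props.put("key.deserializer",
                  "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("value.deserializer",
                  "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://schema-registry_url.com");

        try (KafkaConsumer<Object, Object> consumer =
                 new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("mytopic"));
            while (true) {
                ConsumerRecords<Object, Object> records =
                        consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<Object, Object> record : records) {
                    // Keys and values come back as Avro GenericRecord objects
                    System.out.printf("key=%s value=%s%n",
                                      record.key(), record.value());
                }
            }
        }
    }
}
```

This is only a sketch of the custom-consumer fallback, not a NiFi-native answer; it needs a running broker and schema registry to execute.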

Any help would be greatly appreciated.