Hello all

I have a Beam job that I use to read messages from RabbitMQ and write them
to Kafka.

As of now, messages are read/written as JSON.
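
For context, the relevant part of the pipeline currently looks roughly like
this (URIs, queue and topic names are placeholders):

import java.nio.charset.StandardCharsets;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.io.rabbitmq.RabbitMqIO;
import org.apache.beam.sdk.io.rabbitmq.RabbitMqMessage;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.apache.kafka.common.serialization.StringSerializer;

public class RabbitToKafkaJson {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();
    p.apply(RabbitMqIO.read()
            .withUri("amqp://guest:guest@localhost:5672")
            .withQueue("my-queue"))
     // Message bodies are passed through unchanged, as JSON strings.
     .apply(MapElements.into(TypeDescriptors.strings())
            .via((RabbitMqMessage m) -> new String(m.getBody(), StandardCharsets.UTF_8)))
     .apply(KafkaIO.<Void, String>write()
            .withBootstrapServers("kafka:9092")
            .withTopic("my-topic")
            .withValueSerializer(StringSerializer.class)
            .values());
    p.run();
  }
}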

Obviously, that is not optimal storage, so I would like to convert the
messages to Avro before writing them to Kafka. I have the URL of a schema
registry I can use to store/retrieve my schema.
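
(For what it's worth, I assume the schema can be fetched from the registry
with the Confluent client, assuming the registry is Confluent-compatible;
the subject name below is a placeholder.)

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import org.apache.avro.Schema;

public class FetchSchema {
  public static Schema latest(String registryUrl) throws Exception {
    CachedSchemaRegistryClient client = new CachedSchemaRegistryClient(registryUrl, 100);
    // Latest registered version of the value schema for the target topic.
    SchemaMetadata metadata = client.getLatestSchemaMetadata("my-topic-value");
    return new Schema.Parser().parse(metadata.getSchema());
  }
}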

However, I see nowhere in the Beam documentation how to transform my JSON
into Avro data (except by deserializing my JSON into a Java class that I
would later convert to Avro). Is that deserialization to a class the only
way, or is it possible to generate an Avro GenericRecord from my JSON
"directly"?

Once my Avro data is generated, how can I write it to my Kafka topic?
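
My best guess for the write side is Confluent's KafkaAvroSerializer with
KafkaIO, roughly as below, but I am not sure this is the right way to pass
the registry URL or to deal with the serializer's type:

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import java.util.HashMap;
import java.util.Map;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.values.PCollection;

public class WriteAvroToKafka {
  @SuppressWarnings({"unchecked", "rawtypes"})
  public static void write(PCollection<GenericRecord> records) {
    Map<String, Object> registryConfig = new HashMap<>();
    registryConfig.put("schema.registry.url", "http://schema-registry:8081"); // placeholder
    records.apply(KafkaIO.<Void, GenericRecord>write()
        .withBootstrapServers("kafka:9092")   // placeholder
        .withTopic("my-topic")                // placeholder
        // KafkaAvroSerializer is typed on Object, hence the raw cast.
        .withValueSerializer((Class) KafkaAvroSerializer.class)
        .withProducerConfigUpdates(registryConfig)
        .values());
  }
}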


Thanks!
