Dear Beams,

When producing Avro values to Kafka, say GenericRecord, you typically set
the value.serializer option to
"io.confluent.kafka.serializers.KafkaAvroSerializer".  Together with a few
other options for authentication and so on, this serializer checks the
schema embedded in the Avro record against a schema registry.
Unfortunately, I couldn't figure out how to pass this serializer class to
KafkaIO.write(), as the withValueSerializer() method doesn't accept it.
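Roughly what I tried, simplified (topic name, bootstrap servers, and registry URL are made up; the type parameters are my reading of the KafkaIO javadoc):

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import java.util.Collections;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.kafka.KafkaIO;

// KafkaIO.Write<K, V>.withValueSerializer() expects a
// Class<? extends Serializer<V>>, but KafkaAvroSerializer implements
// Serializer<Object>, not Serializer<GenericRecord>, so with
// V = GenericRecord this does not compile:
KafkaIO.<Long, GenericRecord>write()
    .withBootstrapServers("localhost:9092")
    .withTopic("my-avro-topic")
    .withValueSerializer(KafkaAvroSerializer.class)  // <-- rejected here
    .withProducerConfigUpdates(
        Collections.<String, Object>singletonMap(
            "schema.registry.url", "http://localhost:8081"));
```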

For KafkaIO.read() we made a specific provision for this in the form of the
ConfluentSchemaRegistryDeserializerProvider class, but it doesn't look like
we have covered the producer side of Avro values yet.
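For reference, this is the consumer-side shape I mean (again with made-up URL, topic, and subject names), which resolves the value schema through the registry:

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.io.kafka.ConfluentSchemaRegistryDeserializerProvider;
import org.apache.beam.sdk.io.kafka.KafkaIO;

// On the read side, Beam lets you hand KafkaIO a DeserializerProvider that
// looks up the Avro schema in a Confluent schema registry by subject name.
// There is no equivalent hook on KafkaIO.write() that I can find.
KafkaIO.<Long, GenericRecord>read()
    .withBootstrapServers("localhost:9092")
    .withTopic("my-avro-topic")
    .withValueDeserializer(
        ConfluentSchemaRegistryDeserializerProvider.of(
            "http://localhost:8081", "my-avro-topic-value"));
```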

I'd be happy to dive into the code to add proper support for a Confluent
schema registry to KafkaIO.write(), but I wanted to check first whether there
is something I might have overlooked.  It's hard to find samples or
documentation on producing Avro messages with Beam.

Thanks in advance,

Matt
