Re: Protobuf + Confluent Schema Registry support

2021-06-30 Thread Austin Cawley-Edwards
Hi Vishal,

I don't believe there is another way to solve the problem currently besides
rolling your own serializer.
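
That said, a rough sketch of the wrapper you describe might look like the
following (untested; the class name and wiring are only illustrative). Note
that Confluent's KafkaProtobufSerializer does the schema registration
itself, so delegating to it should spare you from reimplementing that part:

import java.util.Collections

import com.google.protobuf.Message
import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema
import org.apache.kafka.clients.producer.ProducerRecord

// Untested sketch: wraps Confluent's KafkaProtobufSerializer in a Flink
// KafkaSerializationSchema so the Confluent code keeps handling schema
// registration and wire-format framing. Name and parameters are illustrative.
class ConfluentProtobufSerializationSchema[T <: Message](
    topic: String,
    schemaRegistryUrl: String
) extends KafkaSerializationSchema[T] {

  // The Confluent serializer isn't java.io.Serializable, so build it
  // lazily on the task managers instead of shipping it from the client.
  @transient private lazy val inner: KafkaProtobufSerializer[T] = {
    val s = new KafkaProtobufSerializer[T]()
    s.configure(
      Collections.singletonMap("schema.registry.url", schemaRegistryUrl),
      false) // isKey = false: this serializes record values
    s
  }

  override def serialize(
      element: T,
      timestamp: java.lang.Long): ProducerRecord[Array[Byte], Array[Byte]] =
    // serialize() registers the schema on first use and prefixes the payload
    // with the Confluent magic byte + schema id.
    new ProducerRecord[Array[Byte], Array[Byte]](
      topic, inner.serialize(topic, element))
}

You could then hand an instance of this to the FlinkKafkaProducer
constructor that accepts a KafkaSerializationSchema.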

For the Avro + Schema Registry format, is this Table API format[1] what
you're referring to? It doesn't look like there have been discussions around
adding a similar format for Protobuf yet, but perhaps you could start one
based on the Avro one[2]?
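
To make that concrete, using the avro-confluent format from the Table API
looks roughly like this (a sketch only: the topic, columns, and addresses
are made up, and the option names are taken from the 1.13 page in [1]):

import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.table.api.bridge.scala.StreamTableEnvironment

val env = StreamExecutionEnvironment.getExecutionEnvironment
val tableEnv = StreamTableEnvironment.create(env)

// Reads/writes Avro records whose schema lives in the Confluent registry.
tableEnv.executeSql(
  """CREATE TABLE user_events (
    |  user_id STRING,
    |  action  STRING
    |) WITH (
    |  'connector' = 'kafka',
    |  'topic' = 'user_events',
    |  'properties.bootstrap.servers' = 'localhost:9092',
    |  'format' = 'avro-confluent',
    |  'avro-confluent.schema-registry.url' = 'http://localhost:8081'
    |)""".stripMargin)

A Protobuf equivalent would presumably only need a new format identifier
and the same kind of schema-registry option.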

Best,
Austin

[1]:
https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/table/formats/avro-confluent/
[2]:
https://issues.apache.org/jira/browse/FLINK-11160?jql=project%20%3D%20FLINK%20AND%20text%20~%20%22avro%20schema%22


On Wed, Jun 30, 2021 at 4:50 AM Vishal Surana wrote:

> Using the vanilla Kafka producer, I can write Protobuf messages to Kafka
> while leveraging schema registry support as well. A Flink Kafka producer
> requires us to explicitly provide a serializer which converts the message
> to a ProducerRecord containing the serialized bytes of the message. We
> can't make use of the KafkaProtobufSerializer[T] provided by Confluent.
> Thus the only way I could think of would be to create an instance of
> KafkaProtobufSerializer inside a Flink SerializationSchema class and use
> it to serialize my messages. The problem with that would be that I would
> have to implement registration of the schema and other tasks done by
> KafkaProtobufSerializer.
>
> Is there any other way to solve this problem?
> Is there a plan to support Protobuf serialization along with schema
> registry support?
> I noticed you've recently added Avro + Schema Registry support to your
> codebase but haven't documented it. Is it ready for use?
>

