Hi,
I think what you have to do is the following:
1. Create your own DeserializationSchema. There, the deserialize() method
gets a byte[] for each message in Kafka.
2. Deserialize the byte[] using the generated classes from protobuf.
3. If your datatype is called "Foo", there should be a generated class
"Foo" with a static Foo.parseFrom(byte[]) method that does the parsing.
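The steps above can be sketched roughly as follows. Note that the Flink interface and the protobuf-generated class are stubbed out here so the snippet is self-contained: in a real job you would implement Flink's actual DeserializationSchema interface and use the "Foo" class that protoc generated from your .proto file, so the names below are placeholders, not the real APIs.

```java
import java.io.Serializable;
import java.nio.charset.StandardCharsets;

// Stand-in for Flink's DeserializationSchema interface (simplified for
// illustration; the real interface lives in the Flink streaming API).
interface DeserializationSchema<T> extends Serializable {
    T deserialize(byte[] message) throws java.io.IOException;
}

// Stand-in for a protobuf-generated class: protoc generates a class with a
// static parseFrom(byte[]) factory method like this one.
class Foo {
    final String payload;
    private Foo(String payload) { this.payload = payload; }
    static Foo parseFrom(byte[] bytes) {
        // The real generated code parses the protobuf wire format;
        // this stub just decodes UTF-8 to keep the example runnable.
        return new Foo(new String(bytes, StandardCharsets.UTF_8));
    }
}

// Steps 1-3: a schema that turns each Kafka byte[] into a Foo by
// delegating to the generated parseFrom() method.
class FooDeserializationSchema implements DeserializationSchema<Foo> {
    @Override
    public Foo deserialize(byte[] message) {
        return Foo.parseFrom(message);
    }
}

public class Main {
    public static void main(String[] args) throws Exception {
        DeserializationSchema<Foo> schema = new FooDeserializationSchema();
        Foo foo = schema.deserialize("hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(foo.payload);
    }
}
```

In a real job the schema instance would be passed to the Flink Kafka consumer when the source is created, so that each record's bytes are handed to deserialize() as in step 1.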
Hi Fabian
We are already using Flink to read JSON messages from Kafka and index them
into Elasticsearch. Now we have a requirement to read protobuf messages from
Kafka. I am new to protobuf and am looking for help on how to deserialize
protobuf messages from a Kafka consumer using Flink.
-Madhu
On Wed, Mar 9, 2016
Hi,
I haven't used protobuf to serialize Kafka events but this blog post (+ the
linked repository) shows how to write data from Flink into Elasticsearch:
-->
https://www.elastic.co/blog/building-real-time-dashboard-applications-with-apache-flink-elasticsearch-and-kibana
Hope this helps,
Fabian