Hello everyone,


I have a problem using the Kafka data stream connector.



I have a topic with multiple partitions in my Kafka broker. It contains
IoT sensor readings from devices, and I have to use the record key so that
readings from the same device stay in order.
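
For reference, the records are produced roughly like this (a minimal
sketch; the topic name, device id, and payload are just for illustration,
and I use string serializers here):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SensorProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The device id is the record key: Kafka hashes the key to pick
            // the partition, so all readings from one device land in the
            // same partition and keep their order.
            String deviceId = "sensor-42";              // illustrative key
            String reading = "{\"temperature\": 21.5}"; // illustrative payload
            producer.send(new ProducerRecord<>("iot-readings", deviceId, reading));
        }
    }
}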



The problem is that when I try to add a data stream from my broker in
StreamPipes (version 0.67.0), schema reading fails with a
SerializationException:



"org.apache.kafka.common.errors.SerializationException: Error deserializing
key/value for partition ... "
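
Reading the same keyed topic with a plain Kafka consumer is what I would
compare against; a sketch (the string deserializers and group id are
assumptions for illustration):

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KeyedTopicCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "schema-check"); // illustrative group id
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("auto.offset.reset", "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("iot-readings"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> r : records) {
                // A mismatch between the configured key deserializer and the
                // bytes actually written as the key is a common cause of the
                // exception quoted above.
                System.out.printf("key=%s value=%s%n", r.key(), r.value());
            }
        }
    }
}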



I then made a copy of this topic, this time without using a key. With
that copy everything works smoothly and the data can be read in the
pipeline editor.
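
The copy was written with the same producer setup, just without passing a
key (so the key is null and records are spread across partitions), e.g.:

// Same producer as above; omitting the key means no per-device ordering.
producer.send(new ProducerRecord<>("iot-readings-nokey", reading));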



So, how is this stream connector supposed to be used? Can we use record
keys in the Kafka topic, and if not, what is the alternative?



Greetings,

Yudistira
