Hello everyone,

First, I would like to thank in advance everyone willing to help
me with this issue.

I have been coding a custom Kafka source connector that reads from the
Twitter API and writes to a Kafka topic, and I have been trying to use Avro
encoding for those messages, which I found surprisingly difficult. The
technologies used are the Confluent Kafka Docker images 3.1.2 and Kafka
client 0.10.1.0.

To generate the Avro objects I used the avro-maven-plugin, which generates
the classes from the schema correctly. However, I found that, to create
the source record, I have to replicate the same schema in Java code by
building a custom "org.apache.kafka.connect.data.Schema" and then
populating an org.apache.kafka.connect.data.Struct with all the
information. As the value converter class I use the AvroConverter, and,
reviewing its code, the only way to send a custom object is to send it as
a Struct; the converter then builds the Avro representation and uses it to
encode the object. I don't understand why it cannot directly digest the
custom class generated by the avro-maven-plugin to serialize the object,
and I don't know whether this is the correct way to create a connector
with an Avro serializer.
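To illustrate the duplication I mean, here is a minimal sketch of the Struct-building step (the record name "com.example.Tweet" and the fields id, text, user are simplified examples, not my real schema):

```java
import java.util.Collections;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.source.SourceRecord;

public class TweetRecordSketch {
    // Connect schema duplicating what the Avro schema already declares;
    // record name and fields are made up for illustration.
    static final Schema TWEET_SCHEMA = SchemaBuilder.struct()
            .name("com.example.Tweet")
            .field("id", Schema.INT64_SCHEMA)
            .field("text", Schema.STRING_SCHEMA)
            .field("user", Schema.OPTIONAL_STRING_SCHEMA)
            .build();

    static SourceRecord buildRecord() {
        // Populate a Struct instead of the Avro-generated class,
        // since the AvroConverter only accepts Connect data types.
        Struct tweet = new Struct(TWEET_SCHEMA)
                .put("id", 123L)
                .put("text", "hello world")
                .put("user", "xoan");

        return new SourceRecord(
                Collections.singletonMap("source", "twitter"), // source partition
                Collections.singletonMap("offset", 123L),      // source offset
                "tweets",                                      // topic (assumed)
                TWEET_SCHEMA,
                tweet);
    }

    public static void main(String[] args) {
        System.out.println(buildRecord().value());
    }
}
```

The Avro-generated Tweet class already carries this exact schema, so declaring it a second time with SchemaBuilder feels redundant; that duplication is precisely what I am asking about.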

Could I get some feedback on this approach? Thanks to everyone. Xoan.
