You have to either 1) use one of the Confluent serializers
<https://docs.confluent.io/current/schema-registry/serdes-develop/index.html#>
when you publish to the topic, so that the schema (or a reference to it) is
included with each record, or 2) write and use a custom converter
<https://kafka.apache.org/25/javadoc/org/apache/kafka/connect/storage/Converter.html>
that knows the data's schema and can turn the raw Kafka record value into a
Kafka Connect record (by implementing the converter's toConnectData method),
which is what the sink connectors are driven from.

See https://docs.confluent.io/current/connect/concepts.html#converters
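For option 2, a very rough sketch of what such a converter could look like is
below. The package, class, and field names are just hypothetical placeholders,
the schema is hard-coded rather than carried in the message, and the CSV
parsing is only there to keep the example short:

    package com.example.connect;  // hypothetical package

    import java.nio.charset.StandardCharsets;
    import java.util.Map;

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaAndValue;
    import org.apache.kafka.connect.data.SchemaBuilder;
    import org.apache.kafka.connect.data.Struct;
    import org.apache.kafka.connect.storage.Converter;

    public class MyFixedSchemaConverter implements Converter {

        // The schema is known ahead of time instead of being embedded
        // in (or referenced from) each record. Field names are illustrative.
        private static final Schema VALUE_SCHEMA = SchemaBuilder.struct()
                .name("com.example.MyRecord")
                .field("id", Schema.INT64_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .build();

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // no configuration needed for this sketch
        }

        @Override
        public byte[] fromConnectData(String topic, Schema schema, Object value) {
            // Only needed when used on the source side; a sink only calls
            // toConnectData, so this sketch leaves it unimplemented.
            throw new UnsupportedOperationException("sink-only converter");
        }

        @Override
        public SchemaAndValue toConnectData(String topic, byte[] value) {
            // Parse the raw bytes (here a simple "id,name" CSV) and build a
            // Struct that matches the declared schema.
            String[] parts = new String(value, StandardCharsets.UTF_8).split(",", 2);
            Struct struct = new Struct(VALUE_SCHEMA)
                    .put("id", Long.parseLong(parts[0]))
                    .put("name", parts[1]);
            return new SchemaAndValue(VALUE_SCHEMA, struct);
        }
    }

You'd then build that into a jar, put it on the Connect worker's plugin.path,
and set value.converter=com.example.connect.MyFixedSchemaConverter in the
worker or connector config so the JDBC sink receives records with a real
schema attached.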

Chris



On Fri, May 8, 2020 at 6:59 AM vishnu murali <vishnumurali9...@gmail.com>
wrote:

> Hey Guys,
>
> I am using Apache Kafka 2.5, not Confluent.
>
> I am trying to send data from a topic to a database using the JDBC sink
> connector.
>
> We need to send that data with the appropriate schema as well.
>
> Since I am not using the Confluent version of Kafka, can anyone explain
> how I can do this?
>
