Hi,

I am using Kafka/Kafka Connect to track certain events happening in my
application. This is how I have implemented it:
1. My application opens a new KafkaProducer every time the event happens
and writes to my topic. My application has several components running in
YARN, so I did not find a way to have just one producer and reuse it.
Once the event has been published, the producer is closed (sketch below).
2. I am using a Kafka Connect sink connector to consume from my topic,
write to the DB, and do other processing (config sketch below).
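
For reference, the per-event producer path looks roughly like this. This
is a minimal sketch of what I described in point 1, not my exact code;
the bootstrap servers, topic name, and event payload are placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventPublisher {
        // Called once per event: a new producer is created, used for a
        // single send, and closed again.
        public static void publish(String eventJson) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker1:9092"); // placeholder
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            KafkaProducer<String, String> producer = new KafkaProducer<>(props);
            try {
                // placeholder topic name
                producer.send(new ProducerRecord<>("my-events-topic", eventJson));
                producer.flush(); // ensure the record is sent before closing
            } finally {
                producer.close();
            }
        }
    }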
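
The sink side is a standard Kafka Connect configuration along these
lines. Again a sketch: the connector.class depends on which sink
connector is in use, and the names here are placeholders:

    name=my-db-sink                  # placeholder connector name
    connector.class=<your sink connector class>
    tasks.max=1
    topics=my-events-topic           # must match the producer-side topic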

This setup works great as long as we have a stable rate of published
events. The issue I am facing is when a huge number of events (thousands
within minutes) hits Kafka. In that case, my sink connector goes into a
loop, repeatedly re-reading the same events from Kafka without stopping.
What could have triggered this? Please share your insights.

Thanks,
Sri
