Re: Kafka Consumer stops consuming from a topic

2016-07-19 Thread Abhinav Solan
Acks all ... Having one Kafka broker only On Tue, Jul 19, 2016, 9:22 AM David Garcia <dav...@spiceworks.com> wrote: > Ah ok. Another dumb question: what about acks? Are you using auto-ack? > > On 7/19/16, 10:00 AM, "Abhinav Solan" <abhinav.so...@gmail.com> wrote

Re: Kafka Consumer stops consuming from a topic

2016-07-19 Thread Abhinav Solan
If I add 2 more nodes and make it a cluster, would that help? I have searched the forums and this kind of issue is not reported there ... If we have a cluster then maybe the Kafka server has a backup option and can self-heal from this behavior ... Just a theory. On Tue, Jul 19, 2016, 7:57 AM Abhinav Solan

Re: Kafka Consumer stops consuming from a topic

2016-07-19 Thread Abhinav Solan
No, I was monitoring the app at that time ... it was just sitting idle. On Tue, Jul 19, 2016, 7:32 AM David Garcia <dav...@spiceworks.com> wrote: > Is it possible that your app is thrashing (i.e. FullGC’ing too much and > not processing messages)? > > -David > > On 7/19/16,

Re: Kafka Consumer stops consuming from a topic

2016-07-19 Thread Abhinav Solan
Hi Everyone, can anyone help me with this? Thanks, Abhinav. On Mon, Jul 18, 2016, 6:19 PM Abhinav Solan <abhinav.so...@gmail.com> wrote: > Hi Everyone, > > Here are my settings: > Using Kafka 0.9.0.1, 1 instance (as we are testing things on a staging > environment) > Subs

Kafka Consumer stops consuming from a topic

2016-07-18 Thread Abhinav Solan
Hi Everyone, Here are my settings: Using Kafka 0.9.0.1, 1 instance (as we are testing things on a staging environment). Subscribing to 4 topics from a single consumer application with 4 threads. Now the server keeps working fine for a while, then after about 3-4 hrs or so, it stops consuming at

Starting Kafka Connector via JMX

2016-06-13 Thread Abhinav Solan
Hi Everyone, Is there a way to start Kafka Connector via JMX? Thanks, Abhinav

Kafka Consumer error handling

2016-05-17 Thread Abhinav Solan
Hi Everyone, I wanted to know the best and most secure way of error handling for KafkaConsumer. I am using Confluent's recommended consumer implementation. My delivery semantics is at-least-once, and I am switching off auto commit as well. Or should I just switch on auto commit? The thing is I
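With auto commit switched off, the usual at-least-once pattern is: poll a batch, process every record, and only then commit the offsets, so a crash before the commit replays records rather than losing them. A minimal stand-in sketch in plain Java (the `FakeConsumer` here is hypothetical and exists only so the sketch runs without a broker; real code would use `KafkaConsumer.poll` and `commitSync`):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class AtLeastOnceSketch {
    // Hypothetical in-memory stand-in for KafkaConsumer, so the sketch runs
    // without a broker. committedOffset only advances when commitSync is called.
    static class FakeConsumer {
        private final Queue<String> records = new ArrayDeque<>(List.of("a", "b", "c"));
        long committedOffset = 0;
        private long polledOffset = 0;

        List<String> poll() {                       // real code: consumer.poll(timeout)
            List<String> batch = new ArrayList<>();
            while (!records.isEmpty()) { batch.add(records.poll()); polledOffset++; }
            return batch;
        }
        void commitSync() { committedOffset = polledOffset; } // real code: consumer.commitSync()
    }

    public static void main(String[] args) {
        FakeConsumer consumer = new FakeConsumer();
        for (String record : consumer.poll()) {
            System.out.println("processed " + record); // process FIRST...
        }
        consumer.commitSync();                          // ...commit ONLY afterwards
    }
}
```

If processing throws before `commitSync`, the offsets stay uncommitted and the batch is redelivered after a restart, which is exactly the at-least-once behavior the message above asks about.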

Re: Kafka Consumer rebalancing frequently

2016-05-13 Thread Abhinav Solan
Hi Sahitya, Try reducing max.partition.fetch.bytes in your consumer. Then also increase heartbeat.interval.ms; this might help delay the consumer rebalance if your inbound process is taking more time than this. - Abhinav On Fri, May 13, 2016 at 5:42 AM sahitya agrawal
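As a sketch, the two settings suggested above can be set on the consumer like this, together with session.timeout.ms, which heartbeat.interval.ms must stay well below; the concrete values are illustrative assumptions, not recommendations:

```java
import java.util.Properties;

public class RebalanceTuningSketch {
    public static Properties tuningProps() {
        Properties props = new Properties();
        // Smaller per-partition fetches -> each poll() returns faster.
        props.put("max.partition.fetch.bytes", "8192");
        // Longer heartbeat interval, per the advice above; it must still
        // remain well below session.timeout.ms or the consumer is evicted.
        props.put("heartbeat.interval.ms", "3000");
        props.put("session.timeout.ms", "30000");
        return props;
    }

    public static void main(String[] args) {
        tuningProps().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```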

Re: Kafka Consumer consuming large number of messages

2016-05-05 Thread Abhinav Solan
ode look like, where you > are verifying/measuring this consumed size? > > -Jaikiran > On Thursday 05 May 2016 03:00 AM, Abhinav Solan wrote: > > Thanks a lot Jens for the reply. > > One thing is still unclear is this happening only when we set the > > max.partitions.fetc

Re: Kafka Consumer consuming large number of messages

2016-05-04 Thread Abhinav Solan
Thanks a lot Jens for the reply. One thing that is still unclear: is this happening only when we set max.partition.fetch.bytes to a higher value? Because I am instead setting it quite low, at only 8192, since I can control the size of the data coming into Kafka, so even after setting this value

Kafka Consumer consuming large number of messages

2016-05-04 Thread Abhinav Solan
Hi, I am using kafka-0.9.0.1 and have configured the Kafka consumer to fetch 8192 bytes by setting max.partition.fetch.bytes Here are the properties I am using props.put("bootstrap.servers", servers); props.put("group.id", "perf-test"); props.put("offset.storage", "kafka");
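A self-contained sketch of the configuration quoted above, filled out with the deserializer settings the 0.9 consumer requires. The bootstrap servers and String deserializers are illustrative assumptions; the quoted `offset.storage` key is left out here, as the snippet above is truncated and its remaining properties are unknown:

```java
import java.util.Properties;

public class PerfTestConsumerConfig {
    public static Properties config(String servers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", servers);       // assumed, e.g. "localhost:9092"
        props.put("group.id", "perf-test");
        props.put("max.partition.fetch.bytes", "8192"); // cap bytes fetched per partition
        // Deserializers are mandatory for the new consumer; String is assumed here.
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        config("localhost:9092").forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```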