Clearing saved kerberos credentials on login failure

2018-09-13 Thread Tyler Monahan
Hello, I am having an issue with Kerberos auth: when one of my brokers is lost and replaced by a new instance, the running brokers/consumers/producers still try to use the authentication information for the original broker to log in to the new broker. This leaves me in a state of

Re: Low level kafka consumer API to KafkaStreams App.

2018-09-13 Thread Svante Karlsson
You are doing something wrong if you need 10k threads to produce 800k messages per second. It feels like you are a factor of 1000 off. What size are your messages? On Thu, Sep 13, 2018, 21:04 Praveen wrote: > Hi there, I have a kafka application that uses kafka consumer low-level api to help

Low level kafka consumer API to KafkaStreams App.

2018-09-13 Thread Praveen
Hi there, I have a kafka application that uses kafka consumer low-level api to help us process data from a single partition concurrently. Our use case is to send out 800k messages per sec. We are able to do that with 4 boxes using 10k threads and each request taking 50ms in a thread.
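For what it's worth, the figures quoted in the post are internally consistent; a quick back-of-envelope sketch (my own arithmetic, not from the thread):

```scala
object ThroughputMath {
  // Each thread completes one 50 ms request at a time,
  // i.e. 1000 / requestMillis requests per second per thread.
  def totalPerSec(boxes: Int, threadsPerBox: Int, requestMillis: Double): Double =
    boxes * threadsPerBox * (1000.0 / requestMillis)

  def main(args: Array[String]): Unit =
    // 4 boxes x 10k threads x 20 req/s = 800,000 messages/sec
    println(totalPerSec(4, 10000, 50.0))
}
```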

Re: 90 open file handles for one segment

2018-09-13 Thread Matt Kocubinski
Well, the validation is nice. Only seeing this in our test cluster (not production), but still no closer to a cause/resolution. On Tue, Sep 11, 2018 at 4:04 PM Tyler Monahan wrote: > Matt, I am seeing similar behavior with kafka 1.1.0 with 80 copies of the same file being open. I am

Re: Need info

2018-09-13 Thread James Kwan
There are various customers getting data from zOS sources like DB2 or IMS into Kafka. They have written their own consumers to load the data into a data warehouse such as Teradata or another warehouse platform. I am not sure if there is a particular customer using MongoDB, but there are some using

RE: Timing state changes?

2018-09-13 Thread Tim Ward
From: John Roesler > As you noticed, a windowed computation won't work here, because you would be wanting to alert on things that are absent from the window. Instead, you can use a custom Processor with a Key/Value store and schedule punctuations to send the alerts. For example, you can
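The core of the punctuation approach John describes can be sketched in plain Scala, independent of the Kafka Streams API (the names `record`, `punctuate`, and `AbsenceDetector` are illustrative, standing in for a `Processor`'s state store and a scheduled `Punctuator`):

```scala
import scala.collection.mutable

// Track the last time each key was seen; on each scheduled "punctuate"
// pass, report every key that has been silent longer than the timeout.
class AbsenceDetector(timeoutMs: Long) {
  private val lastSeen = mutable.Map.empty[String, Long]

  // Called once per incoming record (analogous to Processor.process()).
  def record(key: String, nowMs: Long): Unit = lastSeen(key) = nowMs

  // Called on a schedule (analogous to a Punctuator): keys to alert on.
  def punctuate(nowMs: Long): Seq[String] =
    lastSeen.collect { case (k, t) if nowMs - t > timeoutMs => k }.toSeq
}
```

In a real Kafka Streams topology the map would be a persistent key/value store, so the "last seen" state survives restarts.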

Understanding default.deserialization.exception.handler

2018-09-13 Thread Tim Ward
With props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG, LogAndContinueExceptionHandler.class); Scenario A: Run application. Feed a message into the topic that will fail deserialization. Application logs exception and keeps running. Shut down application.
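For context, the configuration the post refers to looks like this (a minimal fragment; the application id is a placeholder):

```scala
import java.util.Properties
import org.apache.kafka.streams.StreamsConfig
import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler

val props = new Properties()
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-app") // placeholder id
// Log records that fail deserialization and keep processing,
// instead of the default fail-fast LogAndFailExceptionHandler.
props.put(
  StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
  classOf[LogAndContinueExceptionHandler]
)
```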

Re: SAM Scala aggregate

2018-09-13 Thread Michael Eugene
Thanks John. I thought I was already doing that, but there were places where I wasn't. Your comment forced me to go back and check that I had added the "_root_", and I noticed all the places I hadn't added it. It builds now, thanks! From: John
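The problem `_root_` solves can be shown in a few self-contained lines (an assumed illustration, not code from the thread): a local name that collides with a top-level package shadows it, and `_root_` forces resolution from the root package.

```scala
object RootDemo {
  // A local object whose name collides with the top-level scala package.
  object scala { val note = "shadows the real scala package inside RootDemo" }

  // Unqualified `scala.math.max(1, 2)` would not compile here: it resolves
  // to the local object above. `_root_` restores the top-level package:
  val ok: Int = _root_.scala.math.max(1, 2)
}
```

The same trick is why imports like `import _root_.org.apache.kafka.streams.scala.ImplicitConversions._` are common in Kafka Streams Scala code, where package names such as `org` or `scala` are easy to shadow.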