kstream transform forward to different topics

2019-02-06 Thread Nan Xu
When I do the transform, for a single input record I need to output 3 different records, and those 3 records are of different classes. I want to send each type of record to a separate topic. My understanding is that I should use context.forward inside the transformer, like Transformer{..
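The snippet above is cut off, but the pattern it describes — one input record fanned out to several typed records, each going to its own topic — is usually done with the Processor API: register one sink per output topic and forward to a specific child by name with context.forward(key, value, To.child(...)). A minimal sketch, assuming illustrative topic and node names ("topic-a", "type-a-sink", etc.) and String values standing in for the three record classes:

import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.processor.AbstractProcessor;
import org.apache.kafka.streams.processor.To;

public class SplitToTopicsExample {

    // Placeholder processor: derives three records from one input record and
    // forwards each to a different named child sink. The child names and the
    // "A:"/"B:"/"C:" transformations are illustrative only.
    static class SplittingProcessor extends AbstractProcessor<String, String> {
        @Override
        public void process(String key, String value) {
            context().forward(key, "A:" + value, To.child("type-a-sink"));
            context().forward(key, "B:" + value, To.child("type-b-sink"));
            context().forward(key, "C:" + value, To.child("type-c-sink"));
        }
    }

    public static Topology buildTopology() {
        Topology topology = new Topology();
        topology.addSource("input-source", "input-topic");
        topology.addProcessor("splitter", SplittingProcessor::new, "input-source");
        // One sink per output topic, all children of the same processor.
        topology.addSink("type-a-sink", "topic-a", "splitter");
        topology.addSink("type-b-sink", "topic-b", "splitter");
        topology.addSink("type-c-sink", "topic-c", "splitter");
        return topology;
    }
}

An alternative in the DSL is to transform into a common wrapper type and then split with branch(), writing each branch to its own topic, but routing to named sinks as above keeps the fan-out in one place.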

Re: measuring incoming bytes

2019-02-06 Thread Christopher Bogan
Bytespersec online On Wed, Feb 6, 2019, 12:22 PM Amitav Mohanty Which part? > > -Amitav > > On Mon, Feb 4, 2019 at 8:42 PM Christopher Bogan < > ambitiousking...@gmail.com> > wrote: > > > Correct > > > > On Sat, Feb 2, 2019, 10:09 AM Amitav Mohanty > wrote: > > > > > Hi > > > > > > I am trying

Re: Minimizing global store restoration time

2019-02-06 Thread Taylor P
Hi Patrik, I am not sure that https://issues.apache.org/jira/browse/KAFKA-7380 will resolve this issue since our application is dependent on the global store being fully restored before the application can be considered healthy. It does not seem like KAFKA-7380 is aiming to address the nature of
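For context, one way to express the "healthy only after the global store is restored" requirement is to gate the readiness check on the KafkaStreams state, since (as far as I understand) the instance only reports RUNNING after global store restoration has completed. A small sketch, assuming an already-built KafkaStreams instance; it does not shorten restoration, it only makes the health check wait for it:

import java.util.concurrent.CountDownLatch;
import org.apache.kafka.streams.KafkaStreams;

public class ReadinessGate {

    // Block until the streams instance reaches RUNNING, then report healthy.
    static void awaitRunning(KafkaStreams streams) throws InterruptedException {
        CountDownLatch ready = new CountDownLatch(1);
        streams.setStateListener((newState, oldState) -> {
            if (newState == KafkaStreams.State.RUNNING) {
                ready.countDown();
            }
        });
        streams.start();
        ready.await(); // readiness probe can flip to "healthy" after this returns
    }
}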

Re: Kafka exactly-once with multiple producers

2019-02-06 Thread Tim Jiang
Thanks Matthias, this is exactly the answer I was looking for! On Tue, Feb 5, 2019 at 11:26 PM Matthias J. Sax wrote: > Each producer will need to use its own `transactional.id`. Otherwise, > one producer would fence-off and "block" the other. > > Both producers can start transactions
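To make the quoted advice concrete, here is a sketch of two transactional producers in the same application, each with its own transactional.id so that neither fences the other; the broker address, topic name, and id values are assumptions:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TwoTransactionalProducers {

    // Each producer gets a distinct transactional.id; reusing the same id would
    // cause the second producer to fence the first, as noted in the reply above.
    static KafkaProducer<String, String> newProducer(String transactionalId) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, transactionalId);
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.initTransactions();
        return producer;
    }

    public static void main(String[] args) {
        KafkaProducer<String, String> p1 = newProducer("txn-producer-1");
        KafkaProducer<String, String> p2 = newProducer("txn-producer-2");

        p1.beginTransaction();
        p1.send(new ProducerRecord<>("output-topic", "k1", "v1"));
        p1.commitTransaction();

        p2.beginTransaction();
        p2.send(new ProducerRecord<>("output-topic", "k2", "v2"));
        p2.commitTransaction();

        p1.close();
        p2.close();
    }
}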

Re: measuring incoming bytes

2019-02-06 Thread Amitav Mohanty
Which part? -Amitav On Mon, Feb 4, 2019 at 8:42 PM Christopher Bogan wrote: > Correct > > On Sat, Feb 2, 2019, 10:09 AM Amitav Mohanty wrote: > > > Hi > > > > I am trying to measure incoming bytes over time. I am trying to collect the > > following metric and apply an integral function over a set
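The metric itself is truncated in the preview above, but broker-side incoming bytes are commonly read from the kafka.server:type=BrokerTopicMetrics,name=BytesInPerSec meter over JMX; its Count attribute is cumulative, so sampling it periodically and taking differences gives the integral over each interval. A sketch, assuming JMX is enabled on the broker at localhost:9999 and that this is the metric being discussed:

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class BytesInSampler {
    public static void main(String[] args) throws Exception {
        // Assumes the broker was started with JMX enabled on port 9999.
        JMXServiceURL url =
            new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            ObjectName bytesIn =
                new ObjectName("kafka.server:type=BrokerTopicMetrics,name=BytesInPerSec");

            // "Count" is cumulative bytes in since broker start; "OneMinuteRate"
            // is a smoothed bytes/sec rate.
            Long count = (Long) mbsc.getAttribute(bytesIn, "Count");
            Double oneMinuteRate = (Double) mbsc.getAttribute(bytesIn, "OneMinuteRate");
            System.out.println("cumulative bytes in: " + count
                + ", 1-min rate (bytes/sec): " + oneMinuteRate);
        }
    }
}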

Kafka streams exactly_once auto commit timeout transaction issue

2019-02-06 Thread Xander Uiterlinden
Hi, I'm trying to get a fairly simple example of using Kafka Streams with exactly-once processing to work. I defined a setup where messages are read from an input topic and two streams each transform them and write the result to their own output topic. Under normal conditions this works fine, i.e. when
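The preview is cut off, but the setup it describes — one input topic, two transformed output topics, processing.guarantee=exactly_once — looks roughly like the following sketch; the application id, broker address, topic names, and the trivial mapValues transforms are all assumptions:

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ExactlyOnceTwoOutputs {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "eos-two-outputs");   // assumed app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE);
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("input-topic");

        // Two independent transformations of the same input, each writing to its own topic.
        input.mapValues(v -> v.toUpperCase()).to("output-topic-a");
        input.mapValues(v -> v.toLowerCase()).to("output-topic-b");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}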