But you can only have as many KafkaStreams as there are partitions, correct?

This was actually discussed on IRC yesterday. One solution mentioned
was to consume messages with one consumer thread and fill an internal
queue (e.g. java.util.concurrent.BlockingQueue) for N worker threads
to read from.
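
Something along these lines (a rough sketch only; stream and
someBlockingOperation are from your example below, while numWorkers,
the queue capacity and the Message element type are placeholders you
would pick yourself):

import java.util.concurrent.{ArrayBlockingQueue, Executors}

// Message stands for whatever element type your KafkaStream yields.
// The queue is bounded so the consumer blocks when workers fall behind.
val queue = new ArrayBlockingQueue[Message](1000)
val workers = Executors.newFixedThreadPool(numWorkers)

// N worker threads drain the queue and do the slow per-message work.
for (_ <- 1 to numWorkers) {
  workers.submit(new Runnable {
    def run() {
      while (true) {
        someBlockingOperation(queue.take()) // take() blocks until a message arrives
      }
    }
  })
}

// A single consumer thread iterates the stream and hands messages off.
for (message <- stream) {
  queue.put(message) // put() blocks when the queue is full, giving back-pressure
}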

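And to illustrate Neha's suggestion below of asking the consumer for
more streams so that each thread gets its own: a rough sketch against
the high-level consumer (the createMessageStreams call follows the 0.7
quickstart, so double-check the signature for your version; the topic
name, consumerConfig, numThreads and someBlockingOperation are
placeholders). Per the partition point above, streams beyond the
partition count would just sit idle.

import java.util.concurrent.Executors
import kafka.consumer.Consumer

val connector = Consumer.create(consumerConfig)
// Ask for numThreads streams; Kafka spreads the topic's partitions across them.
val streams = connector.createMessageStreams(Map("my-topic" -> numThreads))("my-topic")

val pool = Executors.newFixedThreadPool(numThreads)
for (stream <- streams) {
  pool.submit(new Runnable {
    def run() {
      // Each stream is iterated by exactly one thread.
      for (message <- stream) {
        someBlockingOperation(message)
      }
    }
  })
}
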
Sent from my phone

On Nov 21, 2012, at 1:20 AM, Neha Narkhede <neha.narkh...@gmail.com> wrote:

> David,
>
> One KafkaStream is meant to be iterated by a single thread. A better
> approach is to request a higher number of streams
> from the Kafka consumer and let each thread have its own KafkaStream.
>
> Thanks,
> Neha
>
> On Tue, Nov 20, 2012 at 9:40 PM, David Ross <dyr...@klout.com> wrote:
>> Hello,
>>
>> We want to process messages from a single KafkaStream in a number of
>> threads. Is it possible to have this code executing in multiple threads
>> against the same stream?
>>
>> for (message <- stream) {
>>  someBlockingOperation(message)
>> }
>>
>> The scaladocs mention thread safety, but some of the code seems fairly
>> stateful. I was wondering if anyone has experience with this or knows if it
>> will work?
>>
>>
>> Thanks,
>>
>> David
