The partitions are ordered (in publication order), so there isn't an easy
way to split a single partition up among threads without breaking that
ordering, which is why the consumer works the way it does. If you don't
care about order, then a thread pool fed by a single iterator will work
just fine.

-Jay
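
A minimal sketch of the thread-pool approach described above. This is not the
actual Kafka consumer API: a plain in-memory iterator stands in for the
KafkaStream, and `someBlockingOperation` is the hypothetical per-message
operation from the question below. A single thread iterates the stream and
hands messages to a pool, so per-partition ordering is lost.

```scala
import java.util.concurrent.{Executors, TimeUnit}
import java.util.concurrent.atomic.AtomicInteger

object ThreadPoolConsumer {
  // Stand-in for a KafkaStream iterator; in real code this would come
  // from the single consumer thread iterating the stream.
  val messages: Iterator[String] = (1 to 100).map(i => s"msg-$i").iterator

  val processed = new AtomicInteger(0)

  // Hypothetical blocking operation on each message.
  def someBlockingOperation(message: String): Unit = {
    processed.incrementAndGet()
  }

  def main(args: Array[String]): Unit = {
    val pool = Executors.newFixedThreadPool(4)
    // Only this one thread touches the iterator; the workers
    // process messages concurrently and out of order.
    for (message <- messages) {
      pool.submit(new Runnable {
        def run(): Unit = someBlockingOperation(message)
      })
    }
    pool.shutdown()
    pool.awaitTermination(10, TimeUnit.SECONDS)
    println(processed.get())
  }
}
```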


On Wed, Nov 21, 2012 at 7:52 AM, David Ross <dyr...@klout.com> wrote:

> Yes, this was a follow up question. I was wondering if I could cheaply
> implement this by sharing the stream. Thanks for responses.
>
> On Wednesday, November 21, 2012, David Arthur wrote:
>
> > But you can only have as many KafkaStreams as there are partitions,
> > correct?
> >
> > This was actually discussed on IRC yesterday. One solution mentioned
> > was to consume messages with one consumer thread and fill an internal
> > queue (e.g. java.util.concurrent.BlockingQueue) for N worker threads
> > to read from.
> >
> > Sent from my phone
> >
> > On Nov 21, 2012, at 1:20 AM, Neha Narkhede <neha.narkh...@gmail.com>
> > wrote:
> >
> > > David,
> > >
> > > One KafkaStream is meant to be iterated by a single thread. A better
> > > approach is to request a higher number of streams
> > > from the Kafka consumer and let each process have its own KafkaStream.
> > >
> > > Thanks,
> > > Neha
> > >
> > > On Tue, Nov 20, 2012 at 9:40 PM, David Ross <dyr...@klout.com>
> > wrote:
> > >> Hello,
> > >>
> > >> We want to process messages from a single KafkaStream in a number of
> > >> processes. Is it possible to have this code executing in multiple
> > threads
> > >> against the same stream?
> > >>
> > >> for (message <- stream) {
> > >>  someBlockingOperation(message)
> > >> }
> > >>
> > >> The scaladocs mention thread safety, but some of the code seems fairly
> > >> stateful. I was wondering if anyone has experience with this or knows
> > if it
> > >> will work?
> > >>
> > >>
> > >> Thanks,
> > >>
> > >> David
> >
>
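
The single-consumer / internal-queue pattern mentioned above (one thread
filling a `java.util.concurrent.BlockingQueue` for N workers) can be sketched
as follows. Names and the poison-pill shutdown scheme are illustrative
choices, not part of Kafka; a plain range again stands in for iterating the
KafkaStream.

```scala
import java.util.concurrent.{ArrayBlockingQueue, Executors, TimeUnit}
import java.util.concurrent.atomic.AtomicInteger

object QueueConsumer {
  val Poison = "__POISON__"          // sentinel telling a worker to stop
  val queue = new ArrayBlockingQueue[String](16)
  val handled = new AtomicInteger(0)

  def main(args: Array[String]): Unit = {
    val nWorkers = 4
    val workers = Executors.newFixedThreadPool(nWorkers)
    for (_ <- 1 to nWorkers) workers.submit(new Runnable {
      def run(): Unit = {
        // Each worker blocks on take() until a message (or poison) arrives.
        var msg = queue.take()
        while (msg != Poison) {
          handled.incrementAndGet()  // stand-in for real processing
          msg = queue.take()
        }
      }
    })
    // Single "consumer" thread fills the queue; put() blocks when
    // the queue is full, providing natural backpressure.
    for (i <- 1 to 100) queue.put(s"msg-$i")
    for (_ <- 1 to nWorkers) queue.put(Poison)
    workers.shutdown()
    workers.awaitTermination(10, TimeUnit.SECONDS)
    println(handled.get())
  }
}
```

The bounded queue keeps the consumer from reading far ahead of the workers,
at the cost of (again) giving up ordering across workers.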
