I agree. The only reason I can think of for the custom partitioning route would
be if your group concept were to grow to a point where a topic-per-category
strategy became prohibitive. This seems unlikely based on what you've said. I
should also add that Todd is spot on regarding the SimpleConsumer.
To add a little more context to Shaun's question, we have around 400
customers. Each customer has a stream of events. Some customers generate a
lot of data while others don't. We need to ensure that each customer's data
is sorted globally by timestamp.
We have two use cases around consumption:
1.
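For what it's worth, the per-customer ordering requirement above is usually met by keyed production rather than anything custom: Kafka guarantees ordering only within a partition, so sending all of one customer's events with the same key keeps them on one partition. A minimal sketch of that routing (md5 stands in for the murmur2 hash Kafka's default partitioner really uses, and `customer-042` is a made-up key):

```python
import hashlib

def partition_for(customer_id: str, num_partitions: int) -> int:
    """Route a record to a partition by hashing its key.

    Kafka's default partitioner uses murmur2 on the key bytes;
    md5 stands in here just to keep the sketch dependency-free.
    """
    digest = int.from_bytes(hashlib.md5(customer_id.encode("utf-8")).digest()[:4], "big")
    return digest % num_partitions

# Every event keyed by the same customer lands in the same partition,
# so Kafka's per-partition ordering yields per-customer timestamp order
# (provided the producer emits each customer's events in order).
assert partition_for("customer-042", 20) == partition_for("customer-042", 20)
```

The trade-off with only 400 keys over a fixed partition count is skew: a few heavy customers can make their partitions much larger than the rest.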
So I disagree with the idea of using custom partitioning, depending on your
requirements. Having a consumer consume from a single partition is not
(currently) that easy. If you don't care which consumer gets which
partition (group), then it's not that bad. You have 20 partitions, you have
20 consumers.
I was under the impression that having 1000 topics with 1 partition
incurs the same load/costs on the Kafka brokers as 1 topic with 1000
partitions.
Shaun
From: Ben Stopford
Sent: September 30, 2015 9:06 AM
To: users@kafka.apache.org
Subject: Re: number of topics given
Hi Shaun
You might consider using a custom partition assignment strategy to push your
different "groups" to different partitions. This would allow you to walk the
middle ground between "all consumers consume everything" and "one topic per
consumer" as you vary the number of partitions in the topic.
Hi
I have read Jay Kreps' post regarding the number of topics that can be handled
by a broker
(https://www.quora.com/How-many-topics-can-be-created-in-Apache-Kafka), and it
has left me with more questions that I don't see answered anywhere else.
We have a data stream which will be consumed by many consumers and groups within the data