I am new to Kafka but I think I have a good use case for it. I am trying
to build daily counts of requests based on a number of different attributes
in a high-throughput system (~1 million requests/sec across all 8
servers). The different attributes are unbounded in terms of values, and
some wi…
Actually, it looks like the better approach would be to output the counts to a
new topic and then ingest that topic into the DB itself. Is that the correct
way?
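The idea in the paragraph above — pre-aggregating counts in the application and producing only the rolled-up totals to an output topic — can be sketched as follows. This is a minimal illustration, not the poster's actual implementation; the attribute encoding, the UTC day bucketing, and the idea that each `(day, attribute)` pair becomes one message on a hypothetical counts topic are all assumptions.

```python
from collections import Counter
from datetime import datetime, timezone

def day_bucket(ts: float) -> str:
    """UTC day key for a request timestamp, e.g. '2018-03-02'."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d")

def aggregate(requests):
    """Roll incoming (timestamp, attribute) pairs into daily counts.

    Returns {(day, attribute): count}; each entry would become one
    message on the hypothetical 'daily-counts' output topic, which a
    sink process could then ingest into the DB.
    """
    counts = Counter()
    for ts, attr in requests:
        counts[(day_bucket(ts), attr)] += 1
    return counts

# Example: three requests on the same UTC day, two sharing an attribute.
reqs = [(1520000000, "country=US"), (1520000100, "country=US"),
        (1520000200, "country=DE")]
print(aggregate(reqs))
```

In practice the counts would be flushed periodically (and merged downstream), since a single server cannot hold a full day of state hostage to one final flush.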
On Fri, Mar 2, 2018 at 9:24 AM, Matt Daum wrote:
> I am new to Kafka but I think I have a good use case for it. I am trying
> to build…
…throughput with batching as compared to one request per msg.
> We have different kinds of msgs/topics and the individual "request" size
> varies from about 100 bytes to 1+ KB.
>
> On 3/2/18, 8:24 AM, "Matt Daum" wrote:
>
> I am new to Kafka but I think I…
…while you do your batching, the Kafka producer
> also tries to batch messages to Kafka, and you will need to ensure you have
> enough buffer memory. However, that's all configurable.
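The batching and buffer-memory behavior described above is controlled by standard Kafka producer settings. A sketch of the relevant knobs (the values shown are illustrative starting points for a high-throughput workload, not recommendations):

```properties
# Max bytes per per-partition batch (default 16384)
batch.size=65536
# Wait up to 50 ms for a batch to fill before sending (default 0)
linger.ms=50
# Total memory for records not yet sent; producer blocks when full (default 32 MB)
buffer.memory=134217728
# Compress whole batches on the wire
compression.type=lz4
```

Larger `batch.size` with a nonzero `linger.ms` trades a little latency for fewer, fuller requests; `buffer.memory` needs headroom for bursts at ~1 M req/sec.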
>
> Finally, ensure you have the latest Java updates and Kafka 0.10.2 or
> higher.
>
> Jayesh
>
> *From: *"Thakrar, Jayesh"
> *Date: *Sunday, March 4, 2018 at 9:25 PM
> *To: *Matt Daum
>
> *Cc: *"users@kafka.apache.org"
> *Subject: *Re: Kafka Setup for Daily counts on wide array of keys
>
>
>
> I don’t have any experience/knowledge on the K…
> …eak further?
>
>
>
> Thanks!
>
> Matt
>
>
>
> On Sun, Mar 4, 2018 at 11:23 PM, Thakrar, Jayesh <
> jthak...@conversantmedia.com> wrote:
>
> BTW - I did not mean to rule out Aerospike as a possible datastore.
>
> It's just that I am not fam…
…balance between batch optimization and
> expedience.
>
>
>
> You may need to do some experiments to balance between system throughput,
> record size, batch size and potential batching delay for a given rate of
> incoming requests.
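One way to start those experiments is with a back-of-envelope calculation. Assuming the ~1 million requests/sec from the original post is spread evenly across the 8 servers (an assumption; the thread doesn't say the load is uniform), the worst-case batching delay is just the batch size divided by the per-server rate:

```python
# Back-of-envelope numbers for the throughput / batch size / delay
# tradeoff, assuming ~1M requests/sec spread evenly over 8 servers.
rate_per_server = 1_000_000 / 8  # requests/sec on one server

for batch_records in (1_000, 10_000, 100_000):
    # Time for one server to accumulate a full batch.
    fill_time_ms = batch_records / rate_per_server * 1000
    # Upper-bound batch size, using the 1 KB max record size from the thread.
    max_batch_kb = batch_records  # batch_records * 1024 bytes / 1024
    print(f"{batch_records:>7} records: ~{fill_time_ms:.0f} ms to fill, "
          f"up to {max_batch_kb} KB")
```

So at this rate even large batches fill in well under a second, which suggests batching delay is unlikely to be the binding constraint here; memory per batch is the number to watch.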
>
>
>
>
>
> *From: