I was just doing the math (rough estimates). With 10K keys, after we push 
7,500 records, we have a worst-case 25% chance of using a new key (implying a 
high probability of getting updates). In the last phase, with 15,000 records, 
I would assume that probability drops significantly and updates become even 
more likely. To keep the "new key" probability high, going with 15K keys would 
be better, but that still seems relatively low, IMHO. With 20K keys, we have 
at least a 25% chance of hitting a new key, which seems more reasonable to me.
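To make the arithmetic above concrete, here is a small sketch (the helper name is hypothetical, not from the PR) of the worst-case "new key" probability: assuming every record pushed so far used a distinct key, only the remaining unused keys can produce a new key.

```python
def worst_case_new_key_prob(num_keys: int, records_pushed: int) -> float:
    # Worst case: each pushed record consumed a distinct key, so the
    # chance the next uniformly-drawn key is new is (unused / total).
    unused = max(0, num_keys - records_pushed)
    return unused / num_keys

# 10K keys after 7,500 records: 25% chance of a new key
print(worst_case_new_key_prob(10_000, 7_500))   # 0.25
# 10K keys after 15,000 records: drops to 0, so only updates
print(worst_case_new_key_prob(10_000, 15_000))  # 0.0
# 20K keys after 15,000 records: back up to 25%
print(worst_case_new_key_prob(20_000, 15_000))  # 0.25
```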

[ Full content available at: https://github.com/apache/kafka/pull/5640 ]