I have a similar use case and the same question.
In my use case, I have only a few hundred active topics at any given
time, but millions of topics created over time.
I do not want to keep the data in the topics; I want to delete it after
it has all been consumed.
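
For that "delete after consumed" pattern, the broker's inactive-topic
deletion settings look relevant. A sketch of the broker.conf values as I
understand them (check the defaults and names for your Pulsar version):

    # Garbage-collect topics once they become inactive.
    brokerDeleteInactiveTopicsEnabled=true
    # Only delete a topic after all subscriptions have consumed its data.
    brokerDeleteInactiveTopicsMode=delete_when_subscriptions_caught_up
    # How often the broker scans for inactive topics, and how long a topic
    # must stay inactive before it is eligible for deletion.
    brokerDeleteInactiveTopicsFrequencySeconds=60
    brokerDeleteInactiveTopicsMaxInactiveDurationSeconds=600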

I found this old conversation: https://github.com/apache/pulsar/issues/3302

I would recommend running a test: write a small program to act as a
producer and another as a consumer with dummy messages, then graph memory
consumption, connection issues, latency and so on.
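Something along these lines (a rough sketch with the Pulsar Java client;
the service URL, topic names and counts are just placeholders):

    import org.apache.pulsar.client.api.*;

    public class TopicScaleTest {
        public static void main(String[] args) throws Exception {
            // Adjust the service URL for your cluster.
            PulsarClient client = PulsarClient.builder()
                    .serviceUrl("pulsar://localhost:6650")
                    .build();

            // Create many short-lived topics: produce one dummy message to
            // each, then consume and acknowledge it.
            for (int i = 0; i < 10_000; i++) {
                String topic = "persistent://public/default/scale-test-" + i;

                Producer<byte[]> producer = client.newProducer()
                        .topic(topic)
                        .create();
                producer.send(("dummy-" + i).getBytes());
                producer.close();

                Consumer<byte[]> consumer = client.newConsumer()
                        .topic(topic)
                        .subscriptionName("scale-test-sub")
                        .subscribe();
                Message<byte[]> msg = consumer.receive();
                consumer.acknowledge(msg);
                consumer.close();
            }

            client.close();
        }
    }

While that runs, watch broker heap usage, metadata store (ZooKeeper) load
and produce/consume latency to see where the topic count starts to hurt.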

Let's see if anybody else has a similar use case.


On Thu, Feb 24, 2022 at 2:09 AM Johnny Miller <joh...@digitalis.io> wrote:

> Hi - I have a project at the design phase, and it would be interesting
> to leverage the million-topic scale in Pulsar. I have experienced
> problems in similar tech where these limits are more "hypothetical"
> than real.
>
> Is this realistic in prod/operationally? Not necessarily a lot of data,
> more just lots of topics.
>
> Any real stories, experiences or war wounds around this level of scale
> would be appreciated.
>
> Thanks,
>
> Johnny
>
