I'm very new to Storm; I've read through a fair amount of the documentation but
haven't started using it yet.
My use case: many users produce data points that all accumulate in a single,
very large Kafka topic (or Kinesis stream), and I want to use Storm to "route"
each user's mini-stream out of that one big stream to multiple processors.
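To make the use case concrete, here's a toy sketch (plain Python, all names made up, not Storm code) of the kind of per-user routing I mean: a deterministic hash on the user ID so that every record from a given user lands on the same processor.

```python
import hashlib

def route(user_id: str, num_processors: int) -> int:
    """Deterministically map a user to one of N processors."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_processors

# Toy stream of (user_id, data_point) records from the one big topic.
records = [("alice", 1), ("bob", 2), ("alice", 3), ("carol", 4)]

# Group records into per-processor buckets; each user's mini-stream
# always ends up in the same bucket.
buckets = {}
for user, point in records:
    buckets.setdefault(route(user, 4), []).append((user, point))
```

(My understanding is that this is roughly what Kafka's keyed partitioning or Storm's fields grouping would do for me, but I'm not sure that's the idiomatic setup.)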
How is this use case typically handled? My first thought was to create, for
each user's mini-stream, a topology whose spout consumes the big stream and
pushes that user's data into a new derived Kafka/Kinesis stream. But that
doesn't seem right: with hundreds of thousands, if not millions, of users, I'd
end up creating on the order of a million topics.
Thanks in advance for any advice!