>>I will add a topic name in my Kafka. So, is it possible to make Storm
read data from that Kafka topic while the cluster is running?

I assume you are referring to making the Spout read data from a new topic
created at runtime (and not actually adding a new Spout, which would change
the topology; that is not possible). If so, this is not supported by the
KafkaSpout that ships with Storm. You would have to extend it or write your
own Spout that can do that.

But why create a new topic for every customer and expect to read from each
one as it is created? Even if this were possible with the KafkaSpout that
ships with Storm, your downstream bolt that stores data into MongoDB would
still have to know which message belongs to which customer in order to
store the record appropriately.

Why wouldn't you have only one topic and identify the user as part of the
message?
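For example, the messages on the single topic could carry the customer id
inside the message itself. Here is a minimal sketch of that idea, assuming a
hypothetical `"<customerId>|<payload>"` wire format (your actual
serialization, e.g. JSON, may differ); a downstream bolt would split the id
off and use it to pick the target database:

```java
// Hypothetical message format: "<customerId>|<payload>".
// A downstream bolt can split the customer id off the front of each
// message and use it to choose the target MongoDB/Cassandra database,
// instead of relying on one Kafka topic per customer.
public class CustomerMessage {
    public final String customerId;
    public final String payload;

    public CustomerMessage(String customerId, String payload) {
        this.customerId = customerId;
        this.payload = payload;
    }

    // Parse "<customerId>|<payload>"; returns null if the delimiter is missing.
    public static CustomerMessage parse(String raw) {
        int sep = raw.indexOf('|');
        if (sep < 0) {
            return null;
        }
        return new CustomerMessage(raw.substring(0, sep),
                                   raw.substring(sep + 1));
    }
}
```

With this layout, new customers need no topology change at all: the Spout
keeps reading the one topic, and the bolt routes each record by
`customerId`.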


On Thu, Aug 6, 2015 at 10:24 PM, Ritesh Sinha <
[email protected]> wrote:

> I have a topology which runs in the following way:
> It reads data from a Kafka topic and passes it to Storm. Storm processes
> the data and stores it into two different DBs (MongoDB & Cassandra).
>
> Here, the Kafka topic is named after the customer, and the database name
> in MongoDB and Cassandra is the same as the name of the Kafka topic.
>
> Now, suppose I have submitted the topology, it is running, and I get a
> new customer.
>
> I will add a topic name in my Kafka. So, is it possible to make Storm read
> data from that Kafka topic while the cluster is running?
>
> Thanks
>