Yes. Since the Kafka spout takes the topic from its configuration, you will
have to add a new spout with a different config. Either you resubmit the
topology (along with the jar), or you can run a separate topology for each
consumer.
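To make the constraint concrete, here is a minimal sketch of how the topic gets fixed at build time, assuming the storm-kafka module of that era (SpoutConfig, KafkaSpout, ZkHosts under backtype.storm/storm.kafka); the ZooKeeper host and topic names are illustrative placeholders:

```java
// Sketch only: depends on the storm-kafka module; host and topic
// names below are made up for illustration.
import backtype.storm.topology.TopologyBuilder;
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;
import storm.kafka.ZkHosts;

public class CustomerTopology {
    public static void main(String[] args) {
        String customerTopic = args[0]; // e.g. "customer-acme" (hypothetical)

        ZkHosts zkHosts = new ZkHosts("zookeeper:2181");

        // The topic is baked into the SpoutConfig when the topology is
        // built. It cannot be changed on a running topology, which is why
        // a new customer/topic means a new spout and a resubmit (or a
        // separate topology per customer).
        SpoutConfig spoutConfig =
            new SpoutConfig(zkHosts, customerTopic, "/kafka-offsets", customerTopic);

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout(spoutConfig));
        // ... attach the MongoDB/Cassandra bolts and submit as usual.
    }
}
```

This is a configuration sketch, not runnable standalone, since it wires against the Storm and storm-kafka libraries.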

On Fri, Aug 7, 2015 at 11:52 AM, Ritesh Sinha <
[email protected]> wrote:

> Do you mean creating a new jar for different customers and deploying it on
> the cluster?
>
> On Fri, Aug 7, 2015 at 11:45 AM, Abhishek Agarwal <[email protected]>
> wrote:
>
>> You will have to re-deploy your topology with a new Kafka spout for the
>> new topic.
>>
>> On Fri, Aug 7, 2015 at 10:54 AM, Ritesh Sinha <
>> [email protected]> wrote:
>>
>>> I have a topology which runs in the following way:
>>> It reads data from a Kafka topic and passes it to Storm. Storm processes
>>> the data and stores it into two different DBs (MongoDB & Cassandra).
>>>
>>> Here, the Kafka topic is named after the customer, and the database name
>>> in MongoDB and Cassandra is the same as the Kafka topic name.
>>>
>>> Now, suppose I have submitted the topology, it is running, and I get a
>>> new customer.
>>>
>>> I will add a new topic in Kafka. So, is it possible to make Storm read
>>> data from that Kafka topic while the cluster is running?
>>>
>>> Thanks
>>>
>>
>>
>>
>> --
>> Regards,
>> Abhishek Agarwal
>>
>>
>


-- 
Regards,
Abhishek Agarwal