[ 
https://issues.apache.org/jira/browse/STORM-817?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14554933#comment-14554933
 ] 

Zhuo Liu edited comment on STORM-817 at 5/21/15 7:46 PM:
---------------------------------------------------------

This use case looks quite interesting and useful.
To support it, the KafkaSpout would need to intermittently check ZooKeeper for 
topics matching the pattern, and then build connections to the new topics' 
partitions. Some failure/security issues may need to be considered (e.g., a 
topic may match the wildcard without actually being one the spout should 
consume).
In terms of implementation, the current ZkCoordinator and 
DynamicPartitionConnections already support dynamic partition/connection 
updates, so it would be great to have that ability in KafkaSpout too.
I expect the size of the change to be reasonable. Sure.
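As a rough sketch of the periodic check described above (the class and method names here are hypothetical, not the actual storm-kafka API), the spout could diff the topics currently visible in ZooKeeper against the ones it already tracks, and open partition connections only for the new matches:

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.regex.Pattern;

// Hypothetical helper: given the topic names read from ZooKeeper, the
// topics the spout already consumes, and the wildcard pattern, return
// only the newly matching topics that need fresh partition connections.
public class TopicDiff {
    public static Set<String> newMatchingTopics(List<String> zkTopics,
                                                Set<String> knownTopics,
                                                Pattern wildcard) {
        Set<String> fresh = new HashSet<>();
        for (String topic : zkTopics) {
            if (wildcard.matcher(topic).matches()
                    && !knownTopics.contains(topic)) {
                fresh.add(topic);
            }
        }
        return fresh;
    }
}
```

The actual refresh would also have to handle topics that disappear, which is where the failure cases mentioned above come in.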

This issue is related to STORM-392:
Kafka's default high-level consumer can already read from multiple topics 
using a whitelist notation (which is basically just a regex). 



> Kafka Wildcard Topic Support
> ----------------------------
>
>                 Key: STORM-817
>                 URL: https://issues.apache.org/jira/browse/STORM-817
>             Project: Apache Storm
>          Issue Type: New Feature
>          Components: storm-kafka
>            Reporter: Sumit Chawla
>
> Creating a feature request for supporting wildcard topics in the Kafka Spout.  
> We want to be able to run an aggregation stream over data coming from all 
> tenants. Tenants are added dynamically, so new Kafka topics get created. All 
> of these topics match a regex pattern. 
> example:
> clickstream:tenant1:log
> clickstream:tenant2:log
> clickstream:tenant3:log
> Storm should be able to perform auto-discovery and fetch from newly created 
> topics at runtime.
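The example topics above all fit one pattern; a minimal sketch with plain `java.util.regex` (the pattern and class name here are hypothetical, since the wildcard syntax the spout would accept is exactly what this issue leaves open):

```java
import java.util.regex.Pattern;

// Hypothetical wildcard for the tenant topics listed in the issue:
// "clickstream:<tenant>:log" with any single-segment tenant name.
public class WildcardMatch {
    static final Pattern TENANT_LOG =
            Pattern.compile("clickstream:[^:]+:log");

    public static boolean matches(String topic) {
        return TENANT_LOG.matcher(topic).matches();
    }
}
```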



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
