[
https://issues.apache.org/jira/browse/SPARK-22561?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Cody Koeninger resolved SPARK-22561.
------------------------------------
Resolution: Not A Problem
> Dynamically update topics list for spark kafka consumer
> -------------------------------------------------------
>
> Key: SPARK-22561
> URL: https://issues.apache.org/jira/browse/SPARK-22561
> Project: Spark
> Issue Type: New Feature
> Components: DStreams
> Affects Versions: 2.1.0, 2.1.1, 2.2.0
> Reporter: Arun
>
> The Spark Streaming application should allow adding new topics after the
> streaming context is initialized and the DStream has started. This is a very
> useful feature, especially when a business operates across multiple
> geographies or business units. For example, initially I have a spark-kafka
> consumer listening for topics ["topic-1","topic-2"], and after a couple of
> days I add new topics ["topic-3","topic-4"] to Kafka. Is there a way to
> update the spark-kafka consumer's topic list and have it consume data for the
> updated list of topics without stopping the Spark Streaming application or
> the streaming context?
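The "Not A Problem" resolution reflects that pattern-based subscription already covers this use case: the spark-streaming-kafka-0-10 integration exposes ConsumerStrategies.SubscribePattern, which subscribes by regex, so topics created later that match the pattern are picked up without restarting the streaming context (detection latency is governed by the consumer's metadata.max.age.ms). A minimal sketch, assuming an existing StreamingContext `ssc` and a local broker at localhost:9092 (both placeholders):

```scala
import java.util.regex.Pattern

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010._

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",          // placeholder broker address
  "key.deserializer"  -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"          -> "example-group",           // hypothetical consumer group
  "auto.offset.reset" -> "latest"
)

// SubscribePattern matches topics by regex. Topics such as "topic-3" and
// "topic-4" created after the stream starts are consumed automatically once
// the consumer refreshes its metadata; no restart of ssc is required.
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.SubscribePattern[String, String](
    Pattern.compile("topic-.*"), kafkaParams))
```

Structured Streaming offers the equivalent via the `subscribePattern` option on the Kafka source.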
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)