Hi, I would like to set up streaming from a Kafka cluster, reading multiple topics and then processing each of them differently. So, I'd create a stream:
val stream = KafkaUtils.createStream(ssc, "localhost:2181", "logs", Map("retarget" -> 2, "datapair" -> 2))

and then, based on whether a message came from the "retarget" topic or the "datapair" topic, set up a different filter function, map function, reduce function, etc. Is that possible? I'd assume it should be, since ConsumerConnector can return a map of KafkaStreams keyed by topic, but I can't find any way that topic information would be visible to Spark.

Thank you,
Sergey Malov
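The fallback I can think of, if the unified stream doesn't expose the topic, is to call createStream once per topic so each DStream gets its own processing chain. A rough, untested sketch (assuming the receiver-based spark-streaming-kafka API, a local ZooKeeper at localhost:2181, and made-up per-topic transformations just for illustration):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object PerTopicStreams {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("PerTopicStreams").setMaster("local[4]")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // One receiver-based stream per topic; each DStream holds (key, message) pairs.
    val retarget = KafkaUtils.createStream(ssc, "localhost:2181", "logs", Map("retarget" -> 2))
    val datapair = KafkaUtils.createStream(ssc, "localhost:2181", "logs", Map("datapair" -> 2))

    // Now each topic can have its own filter/map/reduce pipeline.
    retarget.map(_._2).filter(_.nonEmpty).count().print()
    datapair.map(_._2).map(_.length).reduce(_ + _).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This doubles the number of receivers, though, so I'd prefer a single stream if the topic can be recovered from it.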