Hi Team,
I am working on a use case with Spark Streaming, and I am not sure if it can be
solved with Spark.

My Spark stream will listen to multiple Kafka topics, where each topic receives
various counters with different values.
I need to evaluate around 200 KPI expressions over those counters and publish
the results back to Kafka.
My problem is that, for any particular KPI, I do not know in which micro-batch
its input counters will arrive, and I need to calculate about 200 such KPIs.
I am considering Structured Streaming to keep the state, but I have been unable
to fit the solution into it.
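To make the idea concrete, here is a minimal, framework-free sketch of the state-keeping logic: accumulate counters per key across batches and emit a KPI only once every counter it depends on has arrived. In Structured Streaming the same update function would sit inside an arbitrary-stateful operator such as `flatMapGroupsWithState` (Scala) or `applyInPandasWithState` (PySpark); the KPI names, counter names, and `update_state` helper below are all illustrative assumptions, not your actual expressions.

```python
# Hypothetical KPI definitions: name -> (required counters, expression).
KPIS = {
    "drop_rate": ({"dropped", "attempted"},
                  lambda c: c["dropped"] / c["attempted"]),
    "success_rate": ({"succeeded", "attempted"},
                     lambda c: c["succeeded"] / c["attempted"]),
}

def update_state(state, batch):
    """Merge one micro-batch of {counter: value} pairs into the running
    per-key state, then return every KPI that has become computable."""
    state.update(batch)  # latest value wins; adjust if counters are deltas
    results = {}
    for name, (needed, expr) in KPIS.items():
        if needed <= state.keys():  # all required counters present?
            results[name] = expr(state)
    return results

# Counters for one key trickle in over several micro-batches.
state = {}
print(update_state(state, {"attempted": 100}))  # nothing computable yet
print(update_state(state, {"dropped": 5}))      # drop_rate now computable
print(update_state(state, {"succeeded": 90}))   # both KPIs computable
```

In a real job the `state` dict would be the per-group state object the operator hands you, keyed by whatever groups your counters (e.g. a cell or node id), with a state timeout so partially-filled KPIs are eventually evicted.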
Please help.




Thanks,
Aniket
