Hey,

I was wondering if I could create multiple Spark StreamingContexts in my application (e.g., instantiating a worker actor per topic, where each actor has its own StreamingContext, its own batch duration, everything).
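To make that concrete, here's a rough sketch of what I have in mind (purely illustrative: TopicWorker, the topic names, and the batch durations are made up, and the socket stream is just a stand-in for the actual Kafka source; whether more than one context can even coexist in one JVM is part of my question):

import akka.actor.{Actor, ActorSystem, Props}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical per-topic worker: each actor owns its own StreamingContext
// with its own batch duration -- the setup I am asking about.
class TopicWorker(topic: String, batchSeconds: Long) extends Actor {

  // One StreamingContext per actor, each implicitly creating its own
  // SparkContext -- unclear whether this is allowed within a single JVM.
  private val ssc = new StreamingContext(
    new SparkConf().setAppName(s"stream-$topic"),
    Seconds(batchSeconds))

  override def preStart(): Unit = {
    // Stand-in for the real per-topic Kafka DStream; each context gets
    // its own source and its own output operation.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()
    ssc.start()
  }

  override def postStop(): Unit =
    ssc.stop(stopSparkContext = true, stopGracefully = true)

  def receive: Receive = Actor.emptyBehavior
}

object Workers extends App {
  val system = ActorSystem("topic-workers")
  // One worker actor per topic, each with a different batch duration.
  system.actorOf(Props(new TopicWorker("orders", 5)), "orders")
  system.actorOf(Props(new TopicWorker("clicks", 30)), "clicks")
}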
What are the caveats, if any? What are the best practices? I have googled half-heartedly on this, but it hasn't really been demystified yet. The closest things I could find to skim through were:

http://stackoverflow.com/questions/29612726/how-do-you-setup-multiple-spark-streaming-jobs-with-different-batch-durations
http://stackoverflow.com/questions/37006565/multiple-spark-streaming-contexts-on-one-worker

Thanks in advance!
Sumit