I'm new to Spark. In my experience, when I use a single StreamingContext to create different input streams from different sources, I get multiple errors and problems downstream, so that does not seem like the way to go. From what I've read, creating multiple StreamingContexts is not advised either: they appear to be heavyweight, and OOM errors can occur. Creating a pool of StreamingContexts doesn't seem like an elegant solution. So I'm in a bind as to the best approach for handling multiple sources.
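Roughly, what I'm attempting is the pattern from the programming guide: one StreamingContext hosting several receivers, with the streams unioned before any transformations. A minimal sketch (hosts and ports are made up, and the master needs enough cores for all receivers plus processing):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MultiSourceSketch {
  def main(args: Array[String]): Unit = {
    // local[4]: each receiver occupies a core, so leave at least one for processing.
    val conf = new SparkConf().setAppName("multi-source").setMaster("local[4]")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Two receivers created from the SAME StreamingContext (hosts are placeholders).
    val stream1 = ssc.socketTextStream("host1", 9999)
    val stream2 = ssc.socketTextStream("host2", 9999)

    // Union them so downstream transformations operate on a single DStream.
    val unioned = stream1.union(stream2)
    unioned.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Is this the intended model, or is there a better-supported approach when the sources are of different types?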
-- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-Spark-Streaming-receiver-model-tp21002p21215.html Sent from the Apache Spark User List mailing list archive at Nabble.com.