Hi,

We have a requirement to receive live input messages from RabbitMQ and process them in micro-batches. For this we have selected Spark Streaming, and we have written a connector between a RabbitMQ receiver and Spark Streaming; it is working fine.
Now the main requirement is to receive different categories of events from different channels/queues in a Spark Streaming application.

Q1: How can I create different streams to receive messages from different sources (which may or may not arrive at the same frequency) within one Spark context?
Q2: Is it advisable to use a single StreamingContext to create different input streams from different sources?
Q3: What design considerations do I need to take care of in terms of specifying the number of cores for the Spark master?
Q4: Is there a way to distribute the different stream-receiving tasks to different workers?

Your suggestions will be very valuable for my application design, and if you have an example of a complex application that you can share (a URL reference or anything else), it would be very much appreciated.

Thanks,
Manjul

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-Spark-Streaming-receiver-model-tp21002.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
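For context on Q1/Q2, here is a minimal sketch of what a single StreamingContext hosting several receiver input streams might look like. `RabbitMQReceiver` stands in for the custom receiver described above; its constructor arguments (host, queue name) are assumptions for illustration, not a real API. Note that each receiver occupies one core as a long-running task, so the application needs at least one core more than the number of receivers, or no cores are left for processing.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MultiQueueStreamingApp {
  def main(args: Array[String]): Unit = {
    // Two receivers need two cores, plus at least one core for
    // batch processing, hence local[3] (or more on a cluster).
    val conf = new SparkConf()
      .setAppName("MultiQueueStreaming")
      .setMaster("local[3]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // One input DStream per queue; Spark schedules each receiver
    // onto an executor, so receivers naturally spread over workers.
    // "RabbitMQReceiver" and its arguments are hypothetical here.
    val ordersStream = ssc.receiverStream(new RabbitMQReceiver("rabbit-host", "orders"))
    val eventsStream = ssc.receiverStream(new RabbitMQReceiver("rabbit-host", "events"))

    // Union the streams if they share the same processing logic...
    val combined = ordersStream.union(eventsStream)
    combined.count().print()

    // ...or apply separate transformations to each DStream instead.

    ssc.start()
    ssc.awaitTermination()
  }
}
```

A single StreamingContext per application is the usual pattern; the streams can still be transformed independently even after being created in the same context.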