As per the documentation:

http://spark.apache.org/docs/latest/streaming-programming-guide.html#input-dstreams-and-receivers

"if you want to receive multiple streams of data in parallel in your
streaming application, you can create multiple input DStreams (discussed
further in the Performance Tuning section). This will create multiple
receivers which will simultaneously receive multiple data streams. But note
that a Spark worker/executor is a long-running task, hence it occupies one
of the cores allocated to the Spark Streaming application. Therefore, it is
important to remember that a Spark Streaming application needs to be
allocated enough cores (or threads, if running locally) to process the
received data, as well as to run the receiver(s)."

"it is important to remember that a Spark Streaming application needs to be
allocated enough cores" In this reference of "enough cores",  what will be
the minimum cores for a receiver in Spark Streaming ?

Can we say 2 cores per Receiver ? Kindly correct me to understand it. 
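
To make the question concrete, this is how I currently understand it for local mode with a single receiver (just a rough sketch; the socket source and localhost:9999 are placeholder assumptions on my part):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SingleReceiverExample {
  def main(args: Array[String]): Unit = {
    // "local[2]": one thread is permanently occupied by the receiver,
    // the other is free to process the received batches. With "local[1]"
    // the receiver would take the only thread and nothing would be processed.
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("SingleReceiverExample")
    val ssc = new StreamingContext(conf, Seconds(5))

    // One socket receiver; host and port are placeholders.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}

In this sketch the receiver itself seems to occupy only one core, with the second core needed for processing the data - is that the correct reading of "enough cores"?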
