Re: [Spark Streaming] Distribute custom receivers evenly across executors

2014-06-02 Thread Guang Gao
the streaming job, like: ssc.sparkContext.makeRDD(1 to 1, 1).map(x => (x, 1)).reduceByKey(_ + _, 1000).collect() On Sun, Jun 1, 2014 at 6:06 PM, Guang Gao birdeey...@gmail.com wrote: Dear All, I'm running Spark Streaming (1.0.0) with Yarn (2.2.0) on a 10-node cluster. I set up 10 custom receivers
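The quoted snippet is the suggested workaround: run a throwaway job before starting the receivers so that executors on all nodes register with the driver first; otherwise the receivers get scheduled on whichever few executors came up early. A minimal sketch, assuming a StreamingContext named ssc already exists; the element and partition counts (1000) are illustrative, chosen to match the reduceByKey parallelism in the quoted snippet, whose own numbers look truncated by the archive:

    // Throwaway job: gives YARN time to bring up executors on every
    // node before any receiver is placed, so receivers spread out.
    ssc.sparkContext.makeRDD(1 to 1000, 1000)
      .map(x => (x, 1))
      .reduceByKey(_ + _, 1000)
      .collect()

    // Only after this warm-up should the streaming context be started.
    ssc.start()
    ssc.awaitTermination()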

[Spark Streaming] Distribute custom receivers evenly across executors

2014-06-01 Thread Guang Gao
Dear All, I'm running Spark Streaming (1.0.0) with Yarn (2.2.0) on a 10-node cluster. I set up 10 custom receivers to receive from 10 data streams. I want one receiver per node in order to maximize network bandwidth. However, if I set --executor-cores 4, the 10 receivers only run on 3 of the
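For reference, a sketch of standing up the ten receivers the post describes and unioning their streams; CustomReceiver and the host/port values are hypothetical stand-ins for the poster's actual receiver class:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("TenReceivers")
    val ssc = new StreamingContext(conf, Seconds(1))

    // One receiver per data stream. CustomReceiver would extend
    // org.apache.spark.streaming.receiver.Receiver[String] (hypothetical).
    val streams = (0 until 10).map { i =>
      ssc.receiverStream(new CustomReceiver(s"host-$i", 9999))
    }

    // Combine the per-receiver DStreams into one for downstream processing.
    val unioned = ssc.union(streams)
    unioned.count().print()

    ssc.start()
    ssc.awaitTermination()

Note that nothing in this setup constrains placement: the scheduler may pack several receivers onto the same executor, which is exactly the behavior being reported with --executor-cores 4.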