Re: Spark Streaming : Limiting number of receivers per executor

2016-02-10 Thread Shixiong(Ryan) Zhu
You can't. The total number of cores must be greater than the number of receivers, otherwise no cores are left for processing.
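
To illustrate that constraint, here is a hypothetical submit command (the master, class name, and jar are assumptions, not from this thread): 3 executors with 3 cores each give 9 cores total, so 3 receivers occupy 3 cores and 6 remain for batch processing.

```shell
# Hypothetical sketch: 3 executors x 3 cores = 9 total cores.
# With 3 receivers holding one core each, 6 cores remain free for
# processing DStream batches -- satisfying cores > receivers.
spark-submit \
  --master yarn \
  --num-executors 3 \
  --executor-cores 3 \
  --class com.example.StreamingApp \
  streaming-app.jar
```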



Re: Spark Streaming : Limiting number of receivers per executor

2016-02-10 Thread Yogesh Mahajan
Hi Ajay,

Have you overridden the Receiver#preferredLocation method in your custom
Receiver? You can specify a hostname for each Receiver. Check
ReceiverSchedulingPolicy#scheduleReceivers; it should honor your
preferredLocation value when scheduling receivers.
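
A minimal sketch of what this suggestion looks like in code (the class name, hostnames, and the fake receive loop are hypothetical; assumes the Spark 1.x Receiver API):

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical custom receiver that pins itself to a given host via
// preferredLocation, so the ReceiverSchedulingPolicy tries to place
// each receiver instance on a different executor.
class NetworkReceiver(host: String)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  // Hint to the scheduler: run this receiver on `host` if possible.
  override def preferredLocation: Option[String] = Some(host)

  def onStart(): Unit = {
    // Start a background thread that reads data and hands it to Spark.
    new Thread("Network Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  def onStop(): Unit = {
    // Nothing to do: the receive() loop checks isStopped() and exits.
  }

  private def receive(): Unit = {
    while (!isStopped()) {
      // Placeholder for real network reads; store() passes data to Spark.
      store("sample record")
      Thread.sleep(1000)
    }
  }
}

// Usage sketch: one receiver per executor host (hostnames are assumptions).
// val streams = Seq("node1", "node2", "node3").map { h =>
//   ssc.receiverStream(new NetworkReceiver(h))
// }
// val unioned = ssc.union(streams)
```

Note that preferredLocation is only a hint: if the preferred executor is gone, the scheduler will still place the receiver elsewhere.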




Spark Streaming : Limiting number of receivers per executor

2016-02-10 Thread ajay garg
Hi All,
 I am running 3 executors in my Spark Streaming application, with 3
cores per executor. I have written a custom receiver for receiving network
data.

In my current configuration I am launching 3 receivers, one receiver per
executor.

At runtime, if 2 of my executors die, I am left with only one executor, and
all 3 receivers are scheduled on it. Since this executor has only 3 cores,
and all of them are busy running the 3 receivers, the action on the
accumulated window data (DStream) is never scheduled and my application
hangs.

Is there a way to restrict the number of receivers per executor so that I am
always left with a free core to run the action on the DStream?

Thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Limiting-number-of-receivers-per-executor-tp26192.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org