Re: Can I assign affinity for spark executor processes?

2016-09-19 Thread Xiaoye Sun
Hi Jakob,

Yes, you are right. I should use taskset when I start the *.sh scripts.

In more detail, I changed the last line in ./sbin/start-slaves.sh on the
master to this:

  "${SPARK_HOME}/sbin/slaves.sh" cd "${SPARK_HOME}" \; "taskset" "0xffe" \
    "${SPARK_HOME}/sbin/start-slave.sh" \
    "spark://$SPARK_MASTER_IP:$SPARK_MASTER_PORT"

where 0xffe is the affinity mask.
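
To double-check that the mask took effect (a quick sketch, assuming pgrep
is available; the class names below are the standalone-mode Worker and
executor processes):

  # Affinity mask of the running Worker JVM:
  taskset -p $(pgrep -f org.apache.spark.deploy.worker.Worker)
  # Executors forked by the Worker should inherit the same mask:
  for pid in $(pgrep -f CoarseGrainedExecutorBackend); do
    taskset -p "$pid"
  done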

Thanks!

Best,
Xiaoye

On Tue, Sep 13, 2016 at 11:01 PM, Jakob Odersky  wrote:

> Hi Xiaoye,
> could it be that the executors were spawned before the affinity was
> set on the worker? Would it help to start the Spark worker with taskset
> from the beginning, i.e. "taskset [mask] start-slave.sh"?
> Workers in Spark (standalone mode) simply create processes with the
> standard Java process API. Unless there is something funky going on in
> the JRE, I don't see how Spark could affect CPU affinity.
>
> regards,
> --Jakob
>
> On Tue, Sep 13, 2016 at 7:56 PM, Xiaoye Sun  wrote:
> > Hi,
> >
> > In my experiment, I pin one very important process to a fixed CPU, so the
> > performance of Spark task execution will be affected if the executors or
> > the worker use that CPU. I am wondering if it is possible to keep the
> > Spark executors from using a particular CPU.
> >
> > I tried the 'taskset -p [cpumask] [pid]' command to set the affinity of
> > the Worker process. However, the executor processes created by the Worker
> > process don't inherit the same CPU affinity.
> >
> > Thanks!
> >
> > Best,
> > Xiaoye
>


Re: Can I assign affinity for spark executor processes?

2016-09-13 Thread Jakob Odersky
Hi Xiaoye,
could it be that the executors were spawned before the affinity was
set on the worker? Would it help to start the Spark worker with taskset
from the beginning, i.e. "taskset [mask] start-slave.sh"?
Workers in Spark (standalone mode) simply create processes with the
standard Java process API. Unless there is something funky going on in
the JRE, I don't see how Spark could affect CPU affinity.
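
For illustration, affinity inheritance is plain Linux behavior, nothing
Spark-specific; a quick demo (assuming a machine with at least 2 CPUs):

  # A child forked after the mask is set inherits it:
  taskset 0x2 bash -c '
    taskset -p $$    # parent: mask 2
    sleep 60 &       # child forked with the mask already in place
    taskset -p $!    # child: same mask 2
    kill $!
  '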

regards,
--Jakob

On Tue, Sep 13, 2016 at 7:56 PM, Xiaoye Sun  wrote:
> Hi,
>
> In my experiment, I pin one very important process to a fixed CPU, so the
> performance of Spark task execution will be affected if the executors or the
> worker use that CPU. I am wondering if it is possible to keep the Spark
> executors from using a particular CPU.
>
> I tried the 'taskset -p [cpumask] [pid]' command to set the affinity of the
> Worker process. However, the executor processes created by the Worker
> process don't inherit the same CPU affinity.
>
> Thanks!
>
> Best,
> Xiaoye




Can I assign affinity for spark executor processes?

2016-09-13 Thread Xiaoye Sun
Hi,

In my experiment, I pin one very important process to a fixed CPU, so the
performance of Spark task execution will be affected if the executors or
the worker use that CPU. I am wondering if it is possible to keep the
Spark executors from using a particular CPU.

I tried the 'taskset -p [cpumask] [pid]' command to set the affinity of the
Worker process. However, the executor processes created by the Worker
process don't inherit the same CPU affinity.
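
Concretely, I did something along these lines (a sketch; 0xffe is just an
example mask that excludes CPU 0, and the PID lookup assumes the standalone
Worker's main class name):

  # Re-pin the already-running Worker process away from the reserved CPU:
  WORKER_PID=$(pgrep -f org.apache.spark.deploy.worker.Worker)
  taskset -p 0xffe "$WORKER_PID"   # sets the mask for this PID only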

Thanks!

Best,
Xiaoye

