Thanks,

Since I'm running in local mode, I plan to pin the JVM to a CPU with
taskset -cp <CPU> <PID>; hopefully all the tasks will then run on the
specified CPU core.
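
A rough sketch of the plan (my_app.py is a placeholder for my application;
local[5] gives 5 concurrent task slots, and core 0 is an arbitrary choice):

    # Start the app in local mode with 5 task slots, so all 5
    # partitions can run concurrently inside one JVM
    spark-submit --master "local[5]" my_app.py &

    # Find the driver JVM's PID (jps lists running JVMs)
    jps -l

    # Pin the JVM to core 0; -a applies the affinity to all of
    # its existing threads, not just the main one
    taskset -acp 0 <PID>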

Thanks,
Sujeet

On Thu, Aug 4, 2016 at 8:11 PM, Daniel Darabos <
daniel.dara...@lynxanalytics.com> wrote:

> You could run the application in a Docker container constrained to one
> CPU with --cpuset-cpus
> (https://docs.docker.com/engine/reference/run/#/cpuset-constraint).
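
For reference, Daniel's suggestion would look roughly like this (the image
name and app name are placeholders):

    # Restrict the container, and hence the Spark JVM inside it,
    # to CPU core 0 only
    docker run --cpuset-cpus="0" my-spark-image \
        spark-submit --master "local[5]" my_app.py
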
>
> On Thu, Aug 4, 2016 at 8:51 AM, Sun Rui <sunrise_...@163.com> wrote:
>
>> I don’t think it’s possible, as Spark does not support thread-to-CPU
>> affinity.
>> > On Aug 4, 2016, at 14:27, sujeet jog <sujeet....@gmail.com> wrote:
>> >
>> > Is there a way we can run multiple tasks concurrently on a single core
>> in local mode?
>> >
>> > For example: I have 5 partitions ~ 5 tasks, and only a single core. I
>> want these tasks to run concurrently, and to specify that they run on a
>> single core.
>> >
>> > The machine itself has, say, 4 cores, but I want to utilize only 1 core
>> out of them.
>> >
>> > Is it possible?
>> >
>> > Thanks,
>> > Sujeet
>> >
>>
>>
>>
>>
>>
>
