Re: Spark driver CPU usage

2017-03-01 Thread Yong Zhang
That setting won't control the CPU usage of the driver.


You should check what the CPUs are actually doing on your driver side. But I just want 
to make sure you know that full CPU usage on a 4-core Linux box is reported as 
400%. So 100% really means just one core is busy.


The driver also maintains the application web UI and tracks all kinds of task 
statistics. So even for a simple word count program, if the source is huge and 
generates thousands of tasks, the driver will be busy.
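An editorial aside, not from the original thread: Spark itself offers no knob to cap the client-mode driver's CPU (spark.driver.cores applies only in cluster mode), but an OS-level affinity limit is one workaround. A hypothetical Linux invocation, where the master URL, class, jar, and input path are all placeholders:

```shell
# Pin the client-mode driver JVM to cores 0 and 1 with taskset (Linux only).
# This limits which cores the driver may use; it does not throttle usage
# of those cores. All names below are placeholders.
taskset -c 0,1 spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --class org.apache.spark.examples.JavaWordCount \
  spark-examples.jar input.txt
```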


Yong



From: Phadnis, Varun 
Sent: Wednesday, March 1, 2017 7:57 AM
To: user@spark.apache.org
Subject: RE: Spark driver CPU usage

Does that configuration parameter affect the CPU usage of the driver? If it 
does, we have left that property at its default value of "1", yet we see the 
same behaviour as before.

-----Original Message-----
From: Rohit Verma [mailto:rohit.ve...@rokittech.com]
Sent: 01 March 2017 06:08
To: Phadnis, Varun 
Cc: user@spark.apache.org
Subject: Re: Spark driver CPU usage

Use the conf spark.task.cpus to control the number of CPUs allocated to each task.
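For reference, that property is typically passed at submit time; an illustrative fragment (class, jar, and input names are placeholders). Note it governs how many CPUs the scheduler reserves per task on the executors; it is not a cap on the driver:

```shell
# Reserve 2 CPUs per task on the executors (the default is 1).
# This affects executor task scheduling, not the driver process.
spark-submit \
  --conf spark.task.cpus=2 \
  --class org.apache.spark.examples.JavaWordCount \
  spark-examples.jar input.txt
```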

On Mar 1, 2017, at 5:41 PM, Phadnis, Varun  wrote:
>
> Hello,
>
> Is there a way to control CPU usage for driver when running applications in 
> client mode?
>
> Currently we are observing that the driver occupies all the cores. Launching 
> just 3 instances of the WordCount sample application's driver concurrently on 
> the same machine brings its 4-core CPU to 100% usage. Is this expected 
> behaviour?
>
> Thanks.


-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org