Re: How to get total CPU consumption for Spark job

2015-08-07 Thread gen tang
Hi,

The Spark UI and logs don't report cluster-wide resource usage. However, you can
use Ganglia to monitor the state of the cluster. In spark-ec2, there is an
option to install Ganglia automatically.

If you use CDH, you can also use Cloudera Manager.
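
By the way, Spark 1.3 itself does not expose CPU counters in its task metrics,
so anything you compute inside the job will be wall-clock task time rather than
CPU time. If that rough proxy is still useful to you, a minimal sketch (my own
example, not something shipped with Spark) is to register a SparkListener and
sum executorRunTime over all tasks:

  import java.util.concurrent.atomic.AtomicLong

  import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

  // Hypothetical listener (not part of Spark): sums executorRunTime across tasks.
  // executorRunTime is wall-clock time per task in milliseconds, not CPU time --
  // Spark 1.3 does not expose CPU counters, so treat this only as a rough proxy.
  class TotalTaskTimeListener extends SparkListener {
    val totalRunTimeMs = new AtomicLong(0L)

    override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
      // taskMetrics can be null for failed tasks, hence the Option guard
      Option(taskEnd.taskMetrics).foreach(m => totalRunTimeMs.addAndGet(m.executorRunTime))
    }
  }

  // Usage, assuming an existing SparkContext named sc:
  //   val listener = new TotalTaskTimeListener
  //   sc.addSparkListener(listener)
  //   ... run the job ...
  //   println(s"Total task run time: ${listener.totalRunTimeMs.get} ms")

For actual CPU consumption, though, you still need a cluster monitor such as
Ganglia or Cloudera Manager.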

Cheers
Gen


On Sat, Aug 8, 2015 at 6:06 AM, Xiao JIANG  wrote:

> Hi all,
>
>
> I was running some Hive/Spark jobs on a Hadoop cluster.  I want to see how
> Spark helps improve not only the elapsed time but also the total CPU
> consumption.
>
>
> For Hive, I can get the 'Total MapReduce CPU Time Spent' from the log when
> the job finishes. But I didn't find any CPU stats for Spark jobs in either
> the Spark log or the web UI. Is there any place I can find the total CPU
> consumption for my Spark job? Thanks!
>
>
> Here is the version info: Spark version 1.3.0, Scala version 2.10.4,
> Java 1.7.0_67
>
>
> Thanks!
>
> Xiao
>


How to get total CPU consumption for Spark job

2015-08-07 Thread Xiao JIANG
Hi all,
I was running some Hive/Spark jobs on a Hadoop cluster.  I want to see how Spark
helps improve not only the elapsed time but also the total CPU consumption.

For Hive, I can get the 'Total MapReduce CPU Time Spent' from the log when the
job finishes. But I didn't find any CPU stats for Spark jobs in either the Spark
log or the web UI. Is there any place I can find the total CPU consumption for my
Spark job? Thanks!

Here is the version info: Spark version 1.3.0, Scala version 2.10.4, Java
1.7.0_67

Thanks!
Xiao