Re: CPU RAM

2014-09-17 Thread Akhil Das
Ganglia does give you cluster-wide and per-machine utilization of resources, but I don't think it gives you per-Spark-job figures. If you want to build something from scratch then you can follow steps like: 1. Log in to the machine 2. Get the PIDs 3. For network IO per process, you can have a look at
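Step 3 of the message above is cut off, but the general idea — read a process's counters directly from the kernel — can be sketched in Java on Linux by parsing `/proc/<pid>/status` (the same data `top` reads). This is a Linux-only illustration, not Spark-specific; the class name `ProcReader` is mine, and per-process *network* IO is not in this file (tools that report it use other sources), so the sketch covers resident memory:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Linux-only sketch: read per-process metrics straight from /proc.
// "self" can be replaced by the PID of any process you can read, e.g. a Spark executor.
public class ProcReader {
    // Resident set size (physical RAM in use) in kB, from /proc/<pid>/status.
    static long rssKb(String pid) throws IOException {
        Path status = Paths.get("/proc", pid, "status");
        for (String line : Files.readAllLines(status)) {
            if (line.startsWith("VmRSS:")) {
                // Line looks like "VmRSS:     123456 kB" — keep only the digits.
                return Long.parseLong(line.replaceAll("[^0-9]", ""));
            }
        }
        return -1; // kernel threads have no VmRSS line
    }

    public static void main(String[] args) throws IOException {
        System.out.println("RSS of this JVM: " + rssKb("self") + " kB");
    }
}
```

To monitor a Spark executor you would first find its PID (e.g. with `jps`) and pass it in place of `"self"`.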

Re: CPU RAM

2014-09-17 Thread VJ Shalish
Hi, I need the same through Java. Doesn't the Spark API support this? On Wed, Sep 17, 2014 at 2:48 AM, Akhil Das ak...@sigmoidanalytics.com wrote: Ganglia does give you cluster-wide and per-machine utilization of resources, but I don't think it gives you per-Spark-job figures. If you want to build
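On the "Doesn't the Spark API support this?" point: Spark does ship its own metrics system, configured through `conf/metrics.properties`, which can publish driver and executor metrics (JVM memory, task counters, etc.) to sinks such as JMX or CSV — though it does not report raw per-process OS CPU or network counters. A sketch of a `metrics.properties` enabling two sinks (the period and directory values here are illustrative):

```properties
# Expose all instances' metrics as JMX MBeans, readable from Java via the JMX API.
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

# Also dump metrics to CSV files periodically (directory is an example path).
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics
```

With the JMX sink enabled, a Java program can attach to the driver/executor JVMs and read these metrics as MBeans.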

CPU RAM

2014-09-16 Thread VJ Shalish
Hi, I need to get the CPU utilisation, RAM usage, Network IO and other metrics using a Java program. Can anyone help me with this? Thanks, Shalish.
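For CPU and RAM, the JDK itself already exposes most of what is asked for here, without any third-party library: on HotSpot/OpenJDK, `ManagementFactory.getOperatingSystemMXBean()` can be cast to `com.sun.management.OperatingSystemMXBean`. A minimal sketch (the class name `MetricsProbe` is mine; network IO is not covered by this bean):

```java
import java.lang.management.ManagementFactory;

public class MetricsProbe {
    // HotSpot/OpenJDK-specific subinterface with physical-memory and CPU-load accessors.
    private static final com.sun.management.OperatingSystemMXBean OS =
        (com.sun.management.OperatingSystemMXBean)
            ManagementFactory.getOperatingSystemMXBean();

    // Total physical RAM on the machine, in bytes.
    static long totalPhysicalMemory() { return OS.getTotalPhysicalMemorySize(); }

    // Free physical RAM, in bytes.
    static long freePhysicalMemory() { return OS.getFreePhysicalMemorySize(); }

    // CPU load of this JVM process: 0.0 to 1.0, or negative if not yet available.
    static double processCpuLoad() { return OS.getProcessCpuLoad(); }

    // Heap currently in use by this JVM, in bytes.
    static long heapUsed() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        System.out.println("total RAM bytes : " + totalPhysicalMemory());
        System.out.println("free RAM bytes  : " + freePhysicalMemory());
        System.out.println("process CPU load: " + processCpuLoad());
        System.out.println("heap used bytes : " + heapUsed());
    }
}
```

Note this measures the JVM that runs it; to measure a Spark job's executors you would run such a probe inside them, or attach via JMX.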

Re: CPU RAM

2014-09-16 Thread Amit
Not particularly related to Spark, but you can check out the SIGAR API. It lets you get CPU, Memory, Network, Filesystem and process-based metrics. Amit On Sep 16, 2014, at 20:14, VJ Shalish vjshal...@gmail.com wrote: Hi I need to get the CPU utilisation, RAM usage, Network IO and other

Re: CPU RAM

2014-09-16 Thread VJ Shalish
Thank you for the response, Amit. So is it that we cannot measure the CPU consumption and RAM usage of a Spark job through a Java program? On Tue, Sep 16, 2014 at 11:23 PM, Amit kumarami...@gmail.com wrote: Not particularly related to Spark, but you can check out the SIGAR API. It lets you get CPU,

Re: CPU RAM

2014-09-16 Thread VJ Shalish
Sorry for the confusion, team. My requirement is to measure the CPU utilisation, RAM usage, Network IO and other metrics of a SPARK JOB using a Java program. Please help with the same. On Tue, Sep 16, 2014 at 11:23 PM, Amit kumarami...@gmail.com wrote: Not particularly related to Spark, but you can