Ganglia does give you cluster-wide and per-machine utilization of
resources, but I don't think it gives you per-Spark-job figures. If you
want to build something from scratch, then you can follow steps like:
1. Login to the machine
2. Get the PIDs
3. For network IO per process, you can have a look at
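For the per-process part of the steps above, here is a minimal Java sketch (Linux only, reading `/proc`). The `ProcMetrics` class name and the field positions are my own choices, worth double-checking against proc(5). Note that `/proc/<pid>` exposes per-process CPU and memory (and disk IO in `/proc/<pid>/io`), but not per-process network IO, which needs nethogs-style accounting:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Sketch: read per-process CPU ticks and resident memory from /proc.
// The PID would come from step 2 above; "self" (the current JVM) is
// used as a stand-in when no PID is passed.
public class ProcMetrics {

    // Parse a "VmRSS:   123456 kB" line from /proc/<pid>/status into kB.
    static long parseKb(String line) {
        String[] parts = line.trim().split("\\s+");
        return Long.parseLong(parts[1]);
    }

    // Sum utime + stime (fields 14 and 15, 1-based) from /proc/<pid>/stat.
    static long cpuTicks(String statLine) {
        // The command field (2nd) is in parentheses and may contain
        // spaces, so split only after the closing ')'.
        String rest = statLine.substring(statLine.lastIndexOf(')') + 2);
        String[] f = rest.split(" ");
        // rest starts at field 3, so utime is f[11] and stime is f[12].
        return Long.parseLong(f[11]) + Long.parseLong(f[12]);
    }

    public static void main(String[] args) throws IOException {
        String pid = args.length > 0 ? args[0] : "self";
        for (String line : Files.readAllLines(Paths.get("/proc/" + pid + "/status"))) {
            if (line.startsWith("VmRSS:")) {
                System.out.println("RSS kB: " + parseKb(line));
            }
        }
        String stat = Files.readAllLines(Paths.get("/proc/" + pid + "/stat")).get(0);
        System.out.println("CPU ticks (utime+stime): " + cpuTicks(stat));
    }
}
```

To turn ticks into a CPU percentage you would sample twice and divide the delta by elapsed time times the clock tick rate (usually 100 Hz).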
Hi
I need the same through Java.
Doesn't the Spark API support this?
On Wed, Sep 17, 2014 at 2:48 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Hi
I need to get the CPU utilisation, RAM usage, Network IO and other metrics
using Java program. Can anyone help me on this?
Thanks
Shalish.
Not particularly related to Spark, but you can check out the SIGAR API. It lets
you get CPU, Memory, Network, Filesystem and process-based metrics.
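A rough sketch of what that looks like, assuming the SIGAR jar and its native library are on the path (method names here are from memory and should be checked against the SIGAR javadoc):

```java
import org.hyperic.sigar.CpuPerc;
import org.hyperic.sigar.Mem;
import org.hyperic.sigar.ProcMem;
import org.hyperic.sigar.Sigar;
import org.hyperic.sigar.SigarException;

public class SigarSample {
    public static void main(String[] args) throws SigarException {
        Sigar sigar = new Sigar();

        // System-wide CPU and memory.
        CpuPerc cpu = sigar.getCpuPerc();
        Mem mem = sigar.getMem();
        System.out.println("CPU combined: " + CpuPerc.format(cpu.getCombined()));
        System.out.println("RAM used/total: " + mem.getUsed() + "/" + mem.getTotal());

        // Per-process resident memory; substitute the PID of the Spark
        // process you care about for getPid() (which is this JVM).
        long pid = sigar.getPid();
        ProcMem pm = sigar.getProcMem(pid);
        System.out.println("Process RSS: " + pm.getResident());
    }
}
```

For a Spark job you would feed in the PIDs of the driver and executor processes and poll on an interval.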
Amit
On Sep 16, 2014, at 20:14, VJ Shalish vjshal...@gmail.com wrote:
Thank you for the response, Amit.
So does that mean we cannot measure the CPU consumption and RAM usage of a
Spark job through a Java program?
On Tue, Sep 16, 2014 at 11:23 PM, Amit kumarami...@gmail.com wrote:
Sorry for the confusion, team.
My requirement is to measure the CPU utilisation, RAM usage, Network IO and
other metrics of a SPARK JOB using a Java program.
Please help on the same.
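One partial approach, for whatever it's worth: the JDK's own `com.sun.management.OperatingSystemMXBean` (HotSpot JVMs, Java 7+) reports the CPU load of the JVM it runs in plus system memory, so it only covers a Spark job if polled from inside the driver or executor JVMs. The `JvmMetrics` class name is illustrative:

```java
import java.lang.management.ManagementFactory;

public class JvmMetrics {
    public static void main(String[] args) {
        // Cast to the com.sun.management subinterface, which adds
        // CPU and physical-memory counters to the standard bean.
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean)
                        ManagementFactory.getOperatingSystemMXBean();

        double processCpu = os.getProcessCpuLoad();   // 0.0-1.0, or -1 if unavailable
        long totalRam = os.getTotalPhysicalMemorySize();
        long freeRam = os.getFreePhysicalMemorySize();

        System.out.println("process CPU load: " + processCpu);
        System.out.println("RAM used: " + (totalRam - freeRam) + " of " + totalRam);
    }
}
```

The same bean can also be read remotely over a JMX connection to a running driver or executor, which avoids modifying the job itself.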
On Tue, Sep 16, 2014 at 11:23 PM, Amit kumarami...@gmail.com wrote: