I haven't tried it myself, but Ganglia seems like the right tool for this: it can report per-host CPU, memory, and network usage across the cluster, and Spark can be configured to push its own metrics into it.
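As a rough, untested sketch (assuming you build Spark with the spark-ganglia-lgpl profile, since the Ganglia sink is not bundled by default for licensing reasons), a conf/metrics.properties along these lines should push driver and executor metrics to a Ganglia gmond; the host/port values below are placeholders for your own setup:

    # enable the Ganglia sink for all metric instances (driver, executors, master, worker)
    *.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
    # address of your gmond or multicast group -- placeholder values, adjust to your cluster
    *.sink.ganglia.host=239.2.11.71
    *.sink.ganglia.port=8649
    *.sink.ganglia.period=10
    *.sink.ganglia.unit=seconds
    *.sink.ganglia.mode=multicast
    *.sink.ganglia.ttl=1

    # also enable the JVM source so heap/GC metrics get reported
    driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
    executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource

Note that per-host CPU and network I/O come from Ganglia's own gmond monitors rather than from Spark; the sink above only adds Spark's JVM and executor metrics on top of that, so you would still aggregate across the hosts your job ran on to get job-level totals.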
On Wed, Apr 2, 2014 at 6:40 PM, yxzhao <[email protected]> wrote:

> Hi All,
>
> I am interested in measuring the total network I/O, CPU, and memory
> consumed by a Spark job. I tried to find the related information in the
> logs and the Web UI, but there doesn't seem to be sufficient information.
> Could anyone give me any suggestion?
> Thanks very much in advance.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Measure-the-Total-Network-I-O-Cpu-and-Memory-Consumed-by-Spark-Job-tp3668.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

--
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A. 43210
