Hi all,

I'm looking for a way to measure the current memory/CPU usage of a Spark
application, to give users feedback on how many resources are actually
being used.
It seems that the metrics system provides this information to some extent:
it logs metrics at the application level (number of cores granted) and at
the JVM level (memory usage).
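For reference, this is roughly the setup I've been experimenting with,
based on the metrics.properties template that ships with Spark (the sink
directory and period below are just placeholders I picked):

    # conf/metrics.properties
    # enable the JVM source on driver and executors (heap/non-heap usage etc.)
    driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
    executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
    # dump all metrics to CSV once a minute
    *.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
    *.sink.csv.period=1
    *.sink.csv.unit=minutes
    *.sink.csv.directory=/tmp/spark-metrics/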
Is this the recommended way to gather this kind of information? If so, how
do I best map a Spark application to its corresponding JVM processes?

If not, should I rather request this information from the resource manager
(e.g. Mesos/YARN)?
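For YARN, I assume the ResourceManager REST API would give per-application
aggregates, something like:

    curl http://<rm-host>:8088/ws/v1/cluster/apps/<application-id>

which, if I read the docs correctly, reports fields such as allocatedMB and
allocatedVCores for a running application. I haven't checked whether Mesos
exposes an equivalent.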

thanks,
 Peter

-- 
Peter Prettenhofer
