Hi,
Could anybody please guide me on how to get application- or job-level counters
for CPU and memory in Spark 2.0.0 using the REST API?
I have explored the APIs at
http://spark.apache.org/docs/latest/monitoring.html
but did not find anything similar to what MapReduce provides via its
Job Counters API:
http://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapredAppMasterRest.html#Job_Counters_API
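For context, the closest thing I could find is the per-application /executors endpoint of the monitoring REST API, which exposes per-executor fields such as memoryUsed, totalDuration, and totalGCTime. Below is a minimal sketch of aggregating those into rough application-level memory/CPU counters; the sample JSON values are made up for illustration, and totalDuration (task time) is only an approximation of a CPU counter, not a true CPU-cycles metric:

```python
import json

# Hypothetical sample of what the Spark REST API's
# /api/v1/applications/<app-id>/executors endpoint returns.
# Field names follow Spark 2.0's ExecutorSummary; the values
# here are invented for illustration only.
SAMPLE_EXECUTORS_JSON = """
[
  {"id": "driver", "memoryUsed": 52428800, "maxMemory": 434031820,
   "totalCores": 0, "totalDuration": 0, "totalGCTime": 0},
  {"id": "0", "memoryUsed": 104857600, "maxMemory": 434031820,
   "totalCores": 2, "totalDuration": 91234, "totalGCTime": 812}
]
"""

def aggregate_executor_metrics(executors):
    """Sum per-executor metrics into rough application-level counters.

    totalDuration (milliseconds of task time) is the closest analogue
    to a CPU counter; memoryUsed is storage memory currently in use.
    """
    return {
        "memoryUsedBytes": sum(e["memoryUsed"] for e in executors),
        "taskTimeMillis": sum(e["totalDuration"] for e in executors),
        "gcTimeMillis": sum(e["totalGCTime"] for e in executors),
    }

# Against a live application you would fetch the JSON instead, e.g. from
# http://<driver-host>:4040/api/v1/applications/<app-id>/executors
metrics = aggregate_executor_metrics(json.loads(SAMPLE_EXECUTORS_JSON))
print(metrics)
```

This only covers what the executors endpoint already reports; it does not add new counters the way the MapReduce Job Counters API does.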

Looking forward to your help.
Regards
