How do I calculate the CPU time for a Spark job? Is there an interface I can
call directly?

For comparison, the Hadoop Map-Reduce framework provides "CPU time spent (ms)"
in its Counters.
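One possible approach, sketched below under the assumption of Spark 2.1 or later (where `TaskMetrics` exposes `executorCpuTime`, in nanoseconds), is to register a `SparkListener` and sum the per-task CPU time as tasks finish. The class name `CpuTimeListener` is illustrative, not a Spark API:

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Accumulates per-task CPU time reported in TaskMetrics.
// executorCpuTime (nanoseconds) is available in Spark 2.1+;
// earlier versions only expose wall-clock executorRunTime (ms).
class CpuTimeListener extends SparkListener {
  @volatile var totalCpuTimeNs: Long = 0L

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {
      totalCpuTimeNs += metrics.executorCpuTime
    }
  }
}

// Usage sketch: register before running the job, read the total after.
// sc.addSparkListener(new CpuTimeListener)
```

The same metric is also exposed per stage through the Spark REST API and the web UI, so a listener is only needed if you want to aggregate it programmatically inside the application.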

Thanks!
