Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/22218#discussion_r214544279
--- Diff: core/src/main/scala/org/apache/spark/executor/ExecutorSource.scala ---
@@ -73,6 +76,28 @@ class ExecutorSource(threadPool: ThreadPoolExecutor, executorId: String) extends
registerFileSystemStat(scheme, "write_ops", _.getWriteOps(), 0)
}
+  // Dropwizard metrics gauge measuring the executor's process CPU time.
+  // This Gauge will try to get and return the JVM Process CPU time or return -1 otherwise.
+  // The CPU time value is returned in nanoseconds.
+  // It will use proprietary extensions such as com.sun.management.OperatingSystemMXBean or
+  // com.ibm.lang.management.OperatingSystemMXBean, if available.
+  metricRegistry.register(MetricRegistry.name("jvmCpuTime"), new Gauge[Long] {
--- End diff --
So this isn't exposed except through Dropwizard... it's not plumbed through to
the driver like some of the metrics below? Just checking that this is all
that needs to happen: the metric can be used by external users but is not
otherwise touched by Spark.
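For context, the gauge body in the diff amounts to reading the process CPU time via JMX. Here is a minimal standalone sketch of that read, outside of Spark and Dropwizard; `CpuTime` and `jvmCpuTimeNanos` are hypothetical names for illustration, not Spark code:

```scala
import java.lang.management.ManagementFactory

object CpuTime {
  // Return the JVM process CPU time in nanoseconds, or -1 when the
  // platform bean does not expose it. The standard
  // java.lang.management.OperatingSystemMXBean has no CPU-time accessor;
  // only the proprietary com.sun.management extension (implemented by
  // HotSpot, and by OpenJ9 via a compatible interface) provides it.
  def jvmCpuTimeNanos(): Long =
    ManagementFactory.getOperatingSystemMXBean match {
      case os: com.sun.management.OperatingSystemMXBean => os.getProcessCpuTime
      case _ => -1L // proprietary extension unavailable on this JVM
    }
}
```

Note that `getProcessCpuTime` itself returns -1 when the platform does not support it, so the helper always yields a value >= -1.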
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]