rezasafi commented on a change in pull request #23306: [SPARK-26357][Core] Expose executors' procfs metrics to Metrics system
URL: https://github.com/apache/spark/pull/23306#discussion_r242354085
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/executor/ProcfsMetricsSource.scala
 ##########
 @@ -19,14 +19,15 @@ package org.apache.spark.executor
 
 import com.codahale.metrics.{Gauge, MetricRegistry}
 
+import org.apache.spark.SparkEnv
 import org.apache.spark.internal.config
 import org.apache.spark.metrics.source.Source
-import org.apache.spark.SparkEnv
 
 private[executor] class ProcfsMetricsSource extends Source {
   override val sourceName = "procfs"
-  override val metricRegistry = new MetricRegistry()
 +  // We use numMetrics for tracking to only call computeAllMetrics once per set of metrics
 
 Review comment:
  I think this is good to do here, to avoid calling procfsMetricsGetter.computeAllMetrics multiple times for the same set of metrics. I think we had this discussion in another review as well, but there we removed the need for it by changing the ExecutorMetricType API. Here we can't change the Dropwizard API.
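
A minimal, self-contained sketch of the pattern under discussion (hypothetical names; not the actual PR code): several Dropwizard-style gauges share one expensive `computeAllMetrics` call, and a counter tracks how many gauges have read the current snapshot so the computation runs only once per set of metrics:

```scala
// Hypothetical stand-in for the struct returned by computeAllMetrics.
case class AllMetrics(rssBytes: Long, vmemBytes: Long)

// Sketch of the "compute once per set of metrics" caching discussed above.
// `computeAllMetrics` is the expensive call (reading /proc in the real code);
// `gaugeCount` is how many gauges read from the shared snapshot.
class CachedMetricsSource(computeAllMetrics: () => AllMetrics, gaugeCount: Int) {
  private var cached: AllMetrics = _
  private var served = 0 // gauges that have read the current snapshot

  private def snapshot(): AllMetrics = synchronized {
    if (served == 0) {
      // First gauge of a new reporting cycle: recompute once for the set.
      cached = computeAllMetrics()
    }
    served = (served + 1) % gaugeCount
    cached
  }

  // In the real source these would be registered as Dropwizard Gauges;
  // plain methods keep the sketch dependency-free.
  def rss(): Long = snapshot().rssBytes
  def vmem(): Long = snapshot().vmemBytes
}
```

Since `MetricRegistry` invokes each gauge's `getValue` independently at report time, this counter approach is one way to amortize the `/proc` scan across all gauges without changing the Dropwizard API.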

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
