Hello,

I am trying to implement metrics in a Spark application (using Codahale's
MetricRegistry). I have been able to see metrics for the driver using code
similar to this:

import com.codahale.metrics.{Gauge, MetricRegistry}
import org.apache.spark.SparkEnv
import org.apache.spark.metrics.source.Source

class MyMetricSource extends Source {
  val metricRegistry = new MetricRegistry
  val sourceName = "example.metrics"

  // counter is defined elsewhere in the driver code
  metricRegistry.register(MetricRegistry.name("example", "metric"),
    new Gauge[Int] {
      override def getValue: Int = counter.value
    })
}

val myMetrics = new MyMetricSource
SparkEnv.get.metricsSystem.registerSource(myMetrics)
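
(For reference, these driver metrics only show up for me once a sink is
configured in conf/metrics.properties; a console-sink entry like the one
below is enough for testing. This is standard Spark metrics configuration,
nothing specific to my source.)

# conf/metrics.properties: report all sources to the console every 10 seconds
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds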


I am also trying to get metrics from the executors (for example, the rate at
which they process entries). I would like to use something similar to:

class ExecutorSource extends Source {
  val metricRegistry = new MetricRegistry
  val sourceName = "executor.metrics"

  val lograte = metricRegistry.meter("rate")
}

val executorMetrics = new ExecutorSource
SparkEnv.get.metricsSystem.registerSource(executorMetrics)

rdd.map { x =>
  executorMetrics.lograte.mark()
  x
}.map(...)

I have tried a few things so far, but have not been able to get these metrics
working: I can see the executor.metrics.rate meter on the driver, but its
rate is always zero, and I cannot see any such metrics on the worker
machines.
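
One variant I have been considering, in case the problem is that the instance
captured in the closure is just a serialized copy that never reaches the
executor's metrics system, is to register the source lazily inside each
executor JVM instead of on the driver. A rough sketch (LazyExecutorMetrics is
just a name I made up for illustration, and this assumes registerSource can
be called from executor code the same way as on the driver):

object LazyExecutorMetrics {
  // The lazy val is initialized once per JVM, so the first record processed
  // on each executor registers a fresh ExecutorSource with that executor's
  // own MetricsSystem; later calls just reuse it.
  lazy val source: ExecutorSource = {
    val s = new ExecutorSource
    SparkEnv.get.metricsSystem.registerSource(s)
    s
  }
}

rdd.map { x =>
  LazyExecutorMetrics.source.lograte.mark()
  x
}.map(...)

I have not confirmed whether this is the intended pattern, though.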

Does anyone have any pointers on how to produce executor metrics?

Thank you!
Issac

--
Issac Buenrostro
Software Engineer | [email protected]
www.ooyala.com | blog: http://www.ooyala.com/blog | @ooyala: http://www.twitter.com/ooyala
