Hi,

I'd like to know whether it's possible to read Spark's metrics from application code. For example:

  val sc = new SparkContext(conf)
  val result = myJob(sc, ...)
  result.save(...)

  val gauge = MetricSystem.getGauge("org.apache.spark....")
  println(gauge.getValue)  // or send to an internal aggregation service

I'm aware that Spark can be configured to send metrics to several kinds of
sinks, but I'm more interested in accessing them on a per-job basis, since we
use a custom log/metric aggregation service to build our dashboards.
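
The closest workaround I can think of is registering a SparkListener and
aggregating task metrics myself. A rough sketch of that idea (the
JobMetricsListener class and the choice of executorRunTime are just
illustrative, and I'm assuming taskEnd.taskMetrics is populated for
completed tasks):

  import java.util.concurrent.atomic.AtomicLong
  import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

  // Sums executor run time (ms) over all completed tasks of this application.
  class JobMetricsListener extends SparkListener {
    val totalRunTimeMs = new AtomicLong(0L)
    val completedTasks = new AtomicLong(0L)

    override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
      val metrics = taskEnd.taskMetrics
      if (metrics != null) {  // metrics may be missing for failed tasks
        totalRunTimeMs.addAndGet(metrics.executorRunTime)
        completedTasks.incrementAndGet()
      }
    }
  }

  val listener = new JobMetricsListener
  sc.addSparkListener(listener)
  val result = myJob(sc, ...)
  result.save(...)
  // here we'd push the aggregated values to our service instead of printing
  println(s"tasks=${listener.completedTasks.get} runTimeMs=${listener.totalRunTimeMs.get}")

This only covers task-level metrics, though; it doesn't expose the gauges and
counters that the configured sinks see, which is what I'd really like to read
per job.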

Thanks.
-- 
JU Han

Software Engineer @ Teads.tv

+33 0619608888
