Re: Get spark metrics in code

2016-09-11 Thread Steve Loughran

> On 9 Sep 2016, at 13:20, Han JU  wrote:
> 
> Hi,
> 
> I'd like to know if it's possible to get Spark's metrics from code.
> For example:
> 
>   val sc = new SparkContext(conf)
>   val result = myJob(sc, ...)
>   result.save(...)
>   
>   val gauge = MetricSystem.getGauge("org.apache.spark")
>   println(gauge.getValue)  // or send it to an internal aggregation service
> 
> I'm aware that there's a configuration for sending metrics to several kinds
> of sinks, but I'm more interested in a per-job style, and we use a custom
> log/metric aggregation service for building dashboards.
> 

It's all Coda Hale (Dropwizard) metrics; it should be retrievable somehow, for
a loose definition of "somehow".

I'd be interested in knowing what you come up with here. 
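Since Spark's MetricsSystem is built on Dropwizard (Coda Hale) Metrics but is not itself a public API, the pattern behind the question can be illustrated without Spark at all. The sketch below is a hypothetical, self-contained stand-in: `Gauge`, `MetricRegistry`, and the metric name `myapp.job.recordsWritten` are hand-rolled for illustration, not Spark or Dropwizard classes. It shows the shape of the idea: register a gauge that reads a counter lazily, run the job, then read the gauge after the job and forward the value to your own aggregation service.

```scala
import scala.collection.concurrent.TrieMap

// A Gauge exposes a single point-in-time value, as in Dropwizard Metrics.
trait Gauge[T] { def getValue: T }

// A tiny registry standing in for the Dropwizard MetricRegistry (illustrative).
object MetricRegistry {
  private val gauges = TrieMap.empty[String, Gauge[_]]
  def register[T](name: String, g: Gauge[T]): Unit = { gauges.put(name, g); () }
  def gauge[T](name: String): Option[Gauge[T]] =
    gauges.get(name).map(_.asInstanceOf[Gauge[T]])
}

object Demo {
  def main(args: Array[String]): Unit = {
    var recordsWritten = 0L // counter the "job" updates

    // Register a gauge that reads the counter lazily, the way Spark's
    // internal sources wrap live values.
    MetricRegistry.register("myapp.job.recordsWritten",
      new Gauge[Long] { def getValue: Long = recordsWritten })

    recordsWritten = 42L // pretend the job ran and wrote 42 records

    // After the job: read the gauge, then ship the value wherever you like.
    val v = MetricRegistry.gauge[Long]("myapp.job.recordsWritten").map(_.getValue)
    println(v.getOrElse(-1L)) // prints 42
  }
}
```

For real per-job numbers, a `SparkListener` registered via `sc.addSparkListener` (collecting task metrics in `onTaskEnd`) is the public-API route, since the internal registry is `private[spark]`.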


-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Get spark metrics in code

2016-09-09 Thread Han JU
Hi,

I'd like to know if it's possible to get Spark's metrics from code.
For example:

  val sc = new SparkContext(conf)
  val result = myJob(sc, ...)
  result.save(...)

  val gauge = MetricSystem.getGauge("org.apache.spark")
  println(gauge.getValue)  // or send it to an internal aggregation service

I'm aware that there's a configuration for sending metrics to several kinds
of sinks, but I'm more interested in a per-job style, and we use a custom
log/metric aggregation service for building dashboards.

Thanks.
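For reference, the sink configuration mentioned above lives in `conf/metrics.properties`. A minimal sketch using Spark's bundled CSV sink follows; the polling period and output directory are illustrative values, not defaults:

```properties
# Write metrics from all instances to CSV files (period/directory are illustrative)
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics

# Also expose JVM source metrics for the driver
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

This covers periodic export to a sink; it does not by itself give the "read a gauge after the job finishes" flow asked about above.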
-- 
*JU Han*

Software Engineer @ Teads.tv

+33 061960