Re: Registering custom metrics

2015-06-23 Thread Otis Gospodnetić
Hi,

Not sure if this will fit your needs, but if you are trying to
collect+chart some metrics specific to your app, yet want to correlate them
with what's going on in Spark, such as Spark's performance numbers, you may
want to send your custom metrics to SPM, so they can be
visualized/analyzed/dashboarded along with your Spark metrics. See
http://sematext.com/spm/integrations/spark-monitoring.html for the Spark
piece and https://sematext.atlassian.net/wiki/display/PUBSPM/Custom+Metrics
for Custom Metrics.  If you use Coda Hale's metrics lib, that works as well;
there is a pluggable reporter that will send Coda Hale metrics to SPM.
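
To make the pluggable-reporter idea concrete, here is a minimal sketch with
Coda Hale's built-in ConsoleReporter; the metric name is made up, and the SPM
reporter (whose exact API is not shown here) plugs into the registry the same
way.

import java.util.concurrent.TimeUnit
import com.codahale.metrics.{ConsoleReporter, MetricRegistry}

val registry = new MetricRegistry()
val requests = registry.meter("myapp.requests") // hypothetical metric name

// Any Coda Hale reporter follows this shape: build it against the
// registry, then start it on a schedule.
val reporter = ConsoleReporter.forRegistry(registry)
  .convertRatesTo(TimeUnit.SECONDS)
  .convertDurationsTo(TimeUnit.MILLISECONDS)
  .build()
reporter.start(10, TimeUnit.SECONDS)

requests.mark() // record one request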

HTH.

Otis
--
Monitoring * Alerting * Anomaly Detection * Centralized Log Management
Solr & Elasticsearch Support * http://sematext.com/


On Mon, Jun 22, 2015 at 9:57 AM, dgoldenberg dgoldenberg...@gmail.com
wrote:

 Hi Gerard,

 Have there been any responses? Any insights as to what you ended up doing
 to enable custom metrics? I'm thinking of implementing a custom metrics
 sink; not sure how doable that is yet...

 Thanks.







Re: Registering custom metrics

2015-06-22 Thread Dmitry Goldenberg
Great, thank you, Silvio. In your experience, is there any way to instrument
a callback into Coda Hale or the Spark consumers from the metrics sink? If
the sink performs some steps once it has received the metrics, I'd like to
be able to make the consumers aware of that via some sort of callback.

On Mon, Jun 22, 2015 at 10:14 AM, Silvio Fiorito 
silvio.fior...@granturing.com wrote:

 Sorry, I replied to Gerard’s question instead of yours.

 See here:

 Yes, you have to implement your own custom Metrics Source using the Coda
 Hale library. See here for some examples:
 https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala

 https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala

 The source gets registered, and then you have to configure a sink for it,
 just like the JSON servlet you mentioned.

 I had done it in the past but unfortunately don’t have access to the source
 for that project anymore.

 Thanks,
 Silvio






 On 6/22/15, 9:57 AM, dgoldenberg dgoldenberg...@gmail.com wrote:

 Hi Gerard,
 
 Have there been any responses? Any insights as to what you ended up doing
 to enable custom metrics? I'm thinking of implementing a custom metrics
 sink; not sure how doable that is yet...
 
 Thanks.
 
 
 
 



Re: Registering custom metrics

2015-06-22 Thread dgoldenberg
Hi Gerard,

Have there been any responses? Any insights as to what you ended up doing to
enable custom metrics? I'm thinking of implementing a custom metrics sink;
not sure how doable that is yet...

Thanks.






Re: Registering custom metrics

2015-06-22 Thread Silvio Fiorito
Hi Gerard,

Yes, you have to implement your own custom Metrics Source using the Coda Hale
library. See here for some examples:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala

The source gets registered, and then you have to configure a sink for it,
just like the JSON servlet you mentioned.
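
For illustration, here is a minimal Source in the spirit of JvmSource above.
This is only a sketch and every name in it is made up; note that the Source
trait is private[spark], so the file has to live under an org.apache.spark
package (as Gerard explains elsewhere in this thread):

package org.apache.spark.metrics.source

import com.codahale.metrics.{Gauge, MetricRegistry}

// Hypothetical custom source; sourceName is the prefix the metrics
// appear under in the sinks and the /metrics/json servlet.
class MyAppSource extends Source {
  override val sourceName = "myapp"
  override val metricRegistry = new MetricRegistry()

  // Mutable state updated by the application, exposed as a gauge.
  @volatile var recordsProcessed: Long = 0L

  metricRegistry.register(MetricRegistry.name("records", "processed"),
    new Gauge[Long] {
      override def getValue: Long = recordsProcessed
    })
}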

I had done it in the past but unfortunately don’t have access to the source
for that project anymore.

Thanks,
Silvio

From: Gerard Maas
Date: Thursday, October 30, 2014 at 4:53 PM
To: user, d...@spark.apache.org
Subject: Registering custom metrics

Hi,

I've been exploring the metrics exposed by Spark and I'm wondering whether 
there's a way to register job-specific metrics that could be exposed through 
the existing metrics system.

Would there be an example somewhere?

BTW, the documentation about how the metrics work could be improved. I found
out about the default servlet and the metrics/json/ endpoint from the code; I
could not find any reference to it on the dedicated doc page [1]. Probably
something I could contribute if nobody is on that at the moment.

-kr, Gerard.

[1]   http://spark.apache.org/docs/1.1.0/monitoring.html#Metrics


Re: Registering custom metrics

2015-06-22 Thread Silvio Fiorito
Sorry, I replied to Gerard’s question instead of yours.

See here:

Yes, you have to implement your own custom Metrics Source using the Coda Hale
library. See here for some examples:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/source/JvmSource.scala
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/master/ApplicationSource.scala

The source gets registered, and then you have to configure a sink for it,
just like the JSON servlet you mentioned.

I had done it in the past but unfortunately don’t have access to the source
for that project anymore.

Thanks,
Silvio






On 6/22/15, 9:57 AM, dgoldenberg dgoldenberg...@gmail.com wrote:

Hi Gerard,

Have there been any responses? Any insights as to what you ended up doing to
enable custom metrics? I'm thinking of implementing a custom metrics sink;
not sure how doable that is yet...

Thanks.






Registering Custom metrics [Spark-Streaming-monitoring]

2015-05-28 Thread Snehal Nagmote
Hello All,

I am using Spark Streaming 1.3. I want to capture a few custom metrics based
on accumulators; I followed an approach similar to this one:

val instrumentation = new SparkInstrumentation("example.metrics")
val numReqs = sc.accumulator(0L)
instrumentation.source.registerDailyAccumulator(numReqs, "numReqs")
instrumentation.register()

https://gist.github.com/ibuenros/9b94736c2bad2f4b8e23

After registering the metrics via accumulators, I am not able to see their
values on the sink. I tried the console sink, but still no luck.

Do I need to set any properties in metrics.conf to enable this custom
source?
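
For what it's worth, a console sink is typically enabled with something like
the following in metrics.properties (this mirrors the
metrics.properties.template shipped with Spark; the path below is only an
example):

# metrics.properties: report all metrics to the console every 10 seconds
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds

The file is picked up via the spark.metrics.conf setting, e.g.
spark-submit --conf spark.metrics.conf=/path/to/metrics.properties, and it
has to be readable on every node that should report.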


Also, I noticed accumulators are displayed only on the task/stage pages; it
is difficult to identify in the UI which task/stage that information is
available on.

Is there a way accumulators can be displayed on the overall job stats page?

Any pointers/examples to achieve this would be appreciated; the Spark
monitoring documentation is not very helpful here.

Thanks in advance,

- Snehal


Re: Registering custom metrics

2015-01-08 Thread Gerard Maas
Very interesting approach. Thanks for sharing it!

On Thu, Jan 8, 2015 at 5:30 PM, Enno Shioji eshi...@gmail.com wrote:

 FYI I found this approach by Ooyala.

 /** Instrumentation for Spark based on accumulators.
   *
   * Usage:
   * val instrumentation = new SparkInstrumentation("example.metrics")
   * val numReqs = sc.accumulator(0L)
   * instrumentation.source.registerDailyAccumulator(numReqs, "numReqs")
   * instrumentation.register()
   *
   * Will create and report the following metrics:
   * - Gauge with total number of requests (daily)
   * - Meter with rate of requests
   *
   * @param prefix prefix for all metrics that will be reported by this
   * Instrumentation
   */

 https://gist.github.com/ibuenros/9b94736c2bad2f4b8e23

 On Mon, Jan 5, 2015 at 2:56 PM, Enno Shioji eshi...@gmail.com wrote:

 Hi Gerard,

 Thanks for the answer! I had a good look at it, but I couldn't figure out
 whether one can use it to emit metrics from application code.

 Suppose I wanted to monitor the rate of bytes I produce, like so:

 stream
   .map { input =>
     val bytes = produce(input)
     // metricRegistry.meter("some.metrics").mark(bytes.length)
     bytes
   }
   .saveAsTextFile("text")

 Is there a way to achieve this with the MetricsSystem?



 On Mon, Jan 5, 2015 at 10:24 AM, Gerard Maas gerard.m...@gmail.com
 wrote:

 Hi,

 Yes, I managed to register custom metrics by creating an implementation of
 org.apache.spark.metrics.source.Source and registering it with the metrics
 subsystem. Source is [Spark] private, so you need to create it under an
 org.apache.spark package. In my case, I'm dealing with Spark Streaming
 metrics, and I created my CustomStreamingSource under
 org.apache.spark.streaming, as I also needed access to some [Streaming]
 private components.

 Then, you register your new metric Source with Spark's metrics system,
 like so:

 SparkEnv.get.metricsSystem.registerSource(customStreamingSource)

 And it will get reported to the metrics sinks active on your system. By
 default, you can access them through the metrics endpoint:
 http://driver-host:ui-port/metrics/json
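
 Since MetricsSystem is also [Spark] private, that registerSource call has
 to be made from code living under an org.apache.spark package as well. A
 minimal sketch of such a helper (all names here are made up):

 package org.apache.spark.metrics

 import org.apache.spark.SparkEnv
 import org.apache.spark.metrics.source.Source

 // Registers a custom source once the SparkEnv is up, i.e. after the
 // SparkContext has been created on the driver.
 object CustomMetrics {
   def register(source: Source): Unit =
     SparkEnv.get.metricsSystem.registerSource(source)
 }

 The job then calls CustomMetrics.register(...) early in its startup, with a
 source such as the hypothetical MyAppSource sketched earlier in this digest,
 and the source's gauges show up on the configured sinks and the
 /metrics/json servlet.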

 I hope this helps.

 -kr, Gerard.






 On Tue, Dec 30, 2014 at 3:32 PM, eshioji eshi...@gmail.com wrote:

 Hi,

 Did you find a way to do this, or are you working on it?
 I am trying to find a way to do this as well, but haven't been able to find
 one.










Re: Registering custom metrics

2015-01-08 Thread Enno Shioji
FYI I found this approach by Ooyala.

/** Instrumentation for Spark based on accumulators.
  *
  * Usage:
  * val instrumentation = new SparkInstrumentation("example.metrics")
  * val numReqs = sc.accumulator(0L)
  * instrumentation.source.registerDailyAccumulator(numReqs, "numReqs")
  * instrumentation.register()
  *
  * Will create and report the following metrics:
  * - Gauge with total number of requests (daily)
  * - Meter with rate of requests
  *
  * @param prefix prefix for all metrics that will be reported by this
  * Instrumentation
  */

https://gist.github.com/ibuenros/9b94736c2bad2f4b8e23

On Mon, Jan 5, 2015 at 2:56 PM, Enno Shioji eshi...@gmail.com wrote:

 Hi Gerard,

 Thanks for the answer! I had a good look at it, but I couldn't figure out
 whether one can use it to emit metrics from application code.

 Suppose I wanted to monitor the rate of bytes I produce, like so:

 stream
   .map { input =>
     val bytes = produce(input)
     // metricRegistry.meter("some.metrics").mark(bytes.length)
     bytes
   }
   .saveAsTextFile("text")

 Is there a way to achieve this with the MetricsSystem?



 On Mon, Jan 5, 2015 at 10:24 AM, Gerard Maas gerard.m...@gmail.com
 wrote:

 Hi,

 Yes, I managed to register custom metrics by creating an implementation of
 org.apache.spark.metrics.source.Source and registering it with the metrics
 subsystem. Source is [Spark] private, so you need to create it under an
 org.apache.spark package. In my case, I'm dealing with Spark Streaming
 metrics, and I created my CustomStreamingSource under
 org.apache.spark.streaming, as I also needed access to some [Streaming]
 private components.

 Then, you register your new metric Source with Spark's metrics system,
 like so:

 SparkEnv.get.metricsSystem.registerSource(customStreamingSource)

 And it will get reported to the metrics sinks active on your system. By
 default, you can access them through the metrics endpoint:
 http://driver-host:ui-port/metrics/json

 I hope this helps.

 -kr, Gerard.






 On Tue, Dec 30, 2014 at 3:32 PM, eshioji eshi...@gmail.com wrote:

 Hi,

 Did you find a way to do this, or are you working on it?
 I am trying to find a way to do this as well, but haven't been able to find
 one.









Registering custom metrics

2014-10-30 Thread Gerard Maas
Hi,

I've been exploring the metrics exposed by Spark and I'm wondering whether
there's a way to register job-specific metrics that could be exposed
through the existing metrics system.

Would there be an example somewhere?

BTW, the documentation about how the metrics work could be improved. I found
out about the default servlet and the metrics/json/ endpoint from the code;
I could not find any reference to it on the dedicated doc page [1]. Probably
something I could contribute if nobody is on that at the moment.

-kr, Gerard.

[1]   http://spark.apache.org/docs/1.1.0/monitoring.html#Metrics