vanzin commented on a change in pull request #24901: [SPARK-28091][CORE] Extend
Spark metrics system with user-defined metrics using executor plugins
URL: https://github.com/apache/spark/pull/24901#discussion_r319584490
##########
File path: core/src/main/java/org/apache/spark/ExecutorPlugin.java
##########
@@ -47,6 +48,17 @@
*/
default void init() {}
+  /**
+   * Initialize the executor plugins used to extend the Spark/Dropwizard metrics system.
+   *
+   * <p>Each executor will, during its initialization, invoke this method on each
+   * plugin provided in the spark.executor.metrics.plugins configuration.</p>
+   *
+   * <p>Plugins should register the data sources using the Dropwizard/codahale API</p>
+   *
+   */
+  default void init(MetricRegistry sourceMetricsRegistry) {}
Review comment:
I just want to add a new optional metrics source to Spark, and plugging into
this functionality seemed nicer than having ad-hoc code. It would also mean
this API gets used inside Spark itself.
> just to pass sourceMetricsregistry to all plugins init code
Either is fine; it's a `DeveloperApi`, so we can make those changes. But, for
future proofing, I'd add a level of indirection: an `ExecutorPluginContext`
class that holds references to the things provided to the init method (such as
the metrics registry). Then adding things to that class does not break the API.