LucaCanali commented on a change in pull request #26170: [SPARK-29397][core] Extend plugin interface to include the driver.
URL: https://github.com/apache/spark/pull/26170#discussion_r338781246
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##########
 @@ -539,6 +541,9 @@ class SparkContext(config: SparkConf) extends Logging {
     _heartbeatReceiver = env.rpcEnv.setupEndpoint(
       HeartbeatReceiver.ENDPOINT_NAME, new HeartbeatReceiver(this))
 
+    // Initialize any plugins before the task scheduler is initialized.
+    _plugins = PluginContainer(this)
 
 Review comment:
   There are clear advantages to initializing the driver plugin at this early stage. However, this is not an ideal point for registering metrics (for those plugins that want to do so): a metrics source should ideally be registered with _env.metricsSystem, which is only started at a later point, after the task scheduler has been started. As it stands, driver plugin metrics do not get the application id, so they are difficult to consume.
   How about, for example, splitting the driver plugin code so that the metrics registration part (if any is needed) can take advantage of registering with env.metricsSystem.registerSource(...) once the metrics system is up?
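
   A minimal sketch of the kind of split suggested above (all names here are hypothetical, not from this PR): the plugin's core state is set up early, while metrics registration is deferred to a separate hook that SparkContext would invoke only after env.metricsSystem has started, so the registered source ends up tagged with the application id.

```scala
import com.codahale.metrics.{Counter, MetricRegistry}

// Hypothetical two-phase driver plugin; none of these names come from the PR.
class ExampleDriverPlugin {

  val registry = new MetricRegistry
  private var requests: Counter = _

  // Phase 1: called early in SparkContext initialization, before the task
  // scheduler exists, so the plugin is ready before any task runs.
  def init(): Unit = {
    requests = registry.counter("requests")
  }

  def recordRequest(): Unit = requests.inc()

  // Phase 2: called only after env.metricsSystem has started. The callback
  // stands in for env.metricsSystem.registerSource(...), which is a
  // Spark-internal (private[spark]) API; at that point the source is
  // registered under the application id and is easy to consume downstream.
  def registerMetrics(registerSource: MetricRegistry => Unit): Unit = {
    registerSource(registry)
  }
}
```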

