LucaCanali commented on a change in pull request #28528:
URL: https://github.com/apache/spark/pull/28528#discussion_r430370219



##########
File path: core/src/main/scala/org/apache/spark/executor/Executor.scala
##########
@@ -134,8 +134,11 @@ private[spark] class Executor(
     env.metricsSystem.registerSource(new JVMCPUSource())
     executorMetricsSource.foreach(_.register(env.metricsSystem))
     env.metricsSystem.registerSource(env.blockManager.shuffleMetricsSource)
+  } else {
+    Executor.executorSource = executorSource

Review comment:
       Good question, this is the actual pain point this PR tries to solve: in 
local mode one cannot simply register the metrics with `env.metricsSystem` at 
this point, because the appId is not yet available, so the resulting output 
would be missing a key piece of information (the appId) and would not be 
clearly usable. That's why I propose registering these metrics in the 
SparkContext instead, together with the other driver metrics; as you can see, 
other metrics namespaces use a similar strategy.
   To get around the issue in local mode, I propose the workaround of storing 
the executorSource in the `Executor` companion object so it can be read later, 
once the appId is known. I hope this is acceptable. A minimal sketch of the 
pattern follows below.



