summaryzb commented on a change in pull request #34749:
URL: https://github.com/apache/spark/pull/34749#discussion_r762750607
##########
File path: core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala
##########
@@ -249,21 +249,18 @@ private[spark] class EventLoggingListener(
}
  override def onExecutorMetricsUpdate(event: SparkListenerExecutorMetricsUpdate): Unit = {
+    if (event.execId == SparkContext.DRIVER_IDENTIFIER) {
+      logEvent(event)
+    }
Review comment:
> previously, it's controlled by `shouldLogStageExecutorMetrics` because of `a single event for both driver and executor metrics update`.
Firstly: logging the driver event has no inherent relation with `a single
event for both driver and executor metrics update`; those are two different
things. We can log the driver event and, at the same time, let
`shouldLogStageExecutorMetrics` control the update semantics (see the sketch
below).
Secondly: `shouldLogStageExecutorMetrics` guards a fairly heavy operation, so
I don't think it's a good idea to bind logging the driver metrics to it.
Thirdly: I don't see the value of the driver metrics in
`stageExecutorMetrics`; the driver metrics kept there are never used.
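To make the separation concrete, here is a minimal sketch of the decoupled
handler shape I mean (the per-stage peak tracking under
`shouldLogStageExecutorMetrics` is elided; only names already shown in this
diff are used):
```scala
// Inside EventLoggingListener: log the driver's own metrics update whenever it
// arrives, and keep the heavier per-stage executor peak tracking behind the
// shouldLogStageExecutorMetrics flag, so the two concerns stay independent.
override def onExecutorMetricsUpdate(event: SparkListenerExecutorMetricsUpdate): Unit = {
  if (event.execId == SparkContext.DRIVER_IDENTIFIER) {
    // Driver metrics go straight to the event log, regardless of the flag.
    logEvent(event)
  }
  if (shouldLogStageExecutorMetrics) {
    // ... per-stage executor peak tracking stays here, unchanged ...
  }
}
```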
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]