AngersZhuuuu opened a new pull request #31992:
URL: https://github.com/apache/spark/pull/31992
### What changes were proposed in this pull request?
In the current `EventLoggingListener`, `SparkListenerExecutorMetricsUpdate` events are never written to the event log; the handler only aggregates peak values in memory:
```
override def onExecutorMetricsUpdate(
    event: SparkListenerExecutorMetricsUpdate): Unit = {
  if (shouldLogStageExecutorMetrics) {
    event.executorUpdates.foreach { case (stageKey1, newPeaks) =>
      liveStageExecutorMetrics.foreach { case (stageKey2, metricsPerExecutor) =>
        // If the update came from the driver, stageKey1 will be the dummy key (-1, -1),
        // so record those peaks for all active stages.
        // Otherwise, record the peaks for the matching stage.
        if (stageKey1 == DRIVER_STAGE_KEY || stageKey1 == stageKey2) {
          val metrics = metricsPerExecutor.getOrElseUpdate(
            event.execId, new ExecutorMetrics())
          metrics.compareAndUpdatePeakValues(newPeaks)
        }
      }
    }
  }
}
```
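To illustrate the aggregation pattern above, here is a minimal self-contained sketch of how per-(stage, executor) peaks are tracked, with the dummy driver key fanning out to every active stage. The names `PeakMetrics`, `PeakDemo`, `registerStage`, and `recordPeaks` are illustrative simplifications, not Spark's actual API; only the `getOrElseUpdate`/`compareAndUpdatePeakValues` shape mirrors the listener code.

```scala
import scala.collection.mutable

// Simplified stand-in for Spark's ExecutorMetrics: a fixed-size vector of
// metric values where updates keep the element-wise maximum seen so far.
final class PeakMetrics(n: Int) {
  val values: Array[Long] = Array.fill(n)(0L)

  // Keep the element-wise maximum; returns true if any peak changed.
  def compareAndUpdatePeakValues(update: Array[Long]): Boolean = {
    var changed = false
    for (i <- values.indices if update(i) > values(i)) {
      values(i) = update(i)
      changed = true
    }
    changed
  }
}

object PeakDemo {
  // The dummy key used by driver updates, mirroring DRIVER_STAGE_KEY.
  val DriverStageKey: (Int, Int) = (-1, -1)

  // (stageId, attemptId) -> (executorId -> peaks), like liveStageExecutorMetrics.
  val liveStageExecutorMetrics =
    mutable.HashMap.empty[(Int, Int), mutable.HashMap[String, PeakMetrics]]

  def registerStage(key: (Int, Int)): Unit =
    liveStageExecutorMetrics(key) = mutable.HashMap.empty

  // Record an update: the driver key applies to all active stages,
  // a normal stage key only to the matching stage.
  def recordPeaks(stageKey: (Int, Int), execId: String, update: Array[Long]): Unit =
    liveStageExecutorMetrics.foreach { case (activeStage, perExecutor) =>
      if (stageKey == DriverStageKey || stageKey == activeStage) {
        perExecutor.getOrElseUpdate(execId, new PeakMetrics(update.length))
          .compareAndUpdatePeakValues(update)
      }
    }
}
```

With two active stages registered, two driver-keyed updates `[10, 5]` and `[3, 8]` leave both stages holding the element-wise peak `[10, 8]` for the `driver` entry, which is exactly the fan-out behavior the comment in the listener describes.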
As a result, the driver's `peakMemoryMetrics` are never available in the Spark History Server (SHS), since no logged event carries them. Executors' `peakExecutorMetrics` are still available because they are updated from the `peakMemoryMetrics` attached to `TaskEnd` events.
So in this PR, I add support for writing a `SparkListenerExecutorMetricsUpdate` event for the `driver` to the event log.
### Why are the changes needed?
So that users can see the driver's `peakMemoryMetrics` in the SHS.
### Does this PR introduce _any_ user-facing change?
Yes, users can now see the driver's `peakMemoryMetrics` in the SHS.
### How was this patch tested?
Manual test.