sarutak commented on a change in pull request #30573:
URL: https://github.com/apache/spark/pull/30573#discussion_r544849995



##########
File path: core/src/main/scala/org/apache/spark/status/AppStatusListener.scala
##########
@@ -687,6 +687,9 @@ private[spark] class AppStatusListener(
         stage.killedSummary = killedTasksSummary(event.reason, stage.killedSummary)
       }
       stage.activeTasksPerExecutor(event.taskInfo.executorId) -= 1
+
+      stage.executorSummary(event.taskInfo.executorId).peakExecutorMetrics
+        .compareAndUpdatePeakValues(event.taskExecutorMetrics)

Review comment:
       > It seems that the first metrics of peakExecutorMetrics become 0 instead of -1 after this. @AngersZhuuuu Do you know the reason?
   
   @gengliangwang 
   With this change, `peakExecutorMetrics` is updated not only in `onExecutorMetricsUpdate` but also in `onTaskEnd`.
   So, if the peak value carried by `SparkListenerTaskEnd` is `0`, the corresponding peak value in `peakExecutorMetrics` is set to `0` instead of keeping the initial `-1`.
   Do you have any concern?
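   For illustration, here is a minimal, self-contained sketch of the compare-and-update behaviour described above. It is not Spark's actual `ExecutorMetrics` class; the `SketchExecutorMetrics` name and the fixed metric count are made up for this example. The point is only that the peak array starts with `-1` in its first slot as an "unset" marker, and an element-wise maximum against task-end metrics that report `0` replaces that `-1` with `0`.
   
   ```scala
   object PeakMetricsSketch {
   
     // Simplified stand-in for ExecutorMetrics (hypothetical class for this sketch):
     // one Long per metric, with the first slot set to -1 to mean "no values recorded yet".
     final class SketchExecutorMetrics(numMetrics: Int) {
       val metrics: Array[Long] = new Array[Long](numMetrics)
       metrics(0) = -1L
   
       // Element-wise maximum; returns true if any slot was raised.
       def compareAndUpdatePeakValues(other: SketchExecutorMetrics): Boolean = {
         var updated = false
         var idx = 0
         while (idx < metrics.length) {
           if (other.metrics(idx) > metrics(idx)) {
             metrics(idx) = other.metrics(idx)
             updated = true
           }
           idx += 1
         }
         updated
       }
     }
   
     def main(args: Array[String]): Unit = {
       val peak = new SketchExecutorMetrics(3)     // plays the role of peakExecutorMetrics, still unset
       val taskEnd = new SketchExecutorMetrics(3)  // metrics carried with the task-end event
       taskEnd.metrics(0) = 0L                     // the task reported 0 for the first metric
   
       peak.compareAndUpdatePeakValues(taskEnd)
       println(peak.metrics.mkString(", "))        // prints "0, 0, 0": the initial -1 was replaced by 0
     }
   }
   ```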






