AngersZhuuuu commented on a change in pull request #28034: [SPARK-31268][CORE] Initial Task Executor Metrics with latestMetrics
URL: https://github.com/apache/spark/pull/28034#discussion_r405221154
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/executor/ExecutorMetricsPoller.scala
 ##########
 @@ -107,8 +108,8 @@ private[spark] class ExecutorMetricsPoller(
    * Called by TaskRunner#run.
    */
   def onTaskStart(taskId: Long, stageId: Int, stageAttemptId: Int): Unit = {
-    // Put an entry in taskMetricPeaks for the task.
-    taskMetricPeaks.put(taskId, new AtomicLongArray(ExecutorMetricType.numMetrics))
+    // Put an entry in taskMetricPeaks for the task with latestMetrics.
 
 Review comment:
   > I guess your issue is the metrics didn't get updated because `poll()` was 
not triggered? In that case why not update metrics snapshot here or 
`onTaskCompletion`?
   
   The issue is that between task start and task end, the metrics poller doesn't call `poll()` if the task runs for less than the poll interval. In that case, the task-end event sends the task's executor metrics with their initial values of 0.
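
   To illustrate the fix being discussed, here is a minimal, self-contained sketch (hypothetical names such as `MetricSeedSketch`, `numMetrics`, and `latestMetrics`; not the actual Spark code): instead of creating the task's peak array zero-initialized, it is seeded from the most recently polled snapshot, so a task that finishes within one poll interval still reports non-zero executor metrics.

   ```scala
   import java.util.concurrent.ConcurrentHashMap
   import java.util.concurrent.atomic.AtomicLongArray

   object MetricSeedSketch {
     val numMetrics = 4

     // Assumed: the last snapshot written by the poller thread.
     val latestMetrics: Array[Long] = Array(10L, 20L, 30L, 40L)

     val taskMetricPeaks = new ConcurrentHashMap[Long, AtomicLongArray]()

     def onTaskStart(taskId: Long): Unit = {
       // Seed the new task's peaks from the latest polled values
       // rather than leaving them at the AtomicLongArray default of 0.
       val peaks = new AtomicLongArray(numMetrics)
       var i = 0
       while (i < numMetrics) {
         peaks.set(i, latestMetrics(i))
         i += 1
       }
       taskMetricPeaks.put(taskId, peaks)
     }
   }
   ```

   With this seeding, even if `poll()` never runs during the task's lifetime, the peak values reported at task completion reflect the executor's last known metrics instead of all zeros.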

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
