stczwd opened a new pull request #35185:
URL: https://github.com/apache/spark/pull/35185


   ### Why are the changes needed?
   The current task metrics do not include a partition id. This makes it difficult to track the metrics of the task that processed a given partition, or to aggregate per-partition metrics for a stage, especially when the stage is retried. The existing `TaskData` class below has no `partitionId` field:
   ```scala
   class TaskData private[spark](
       val taskId: Long,
       val index: Int,
       val attempt: Int,
       val launchTime: Date,
       val resultFetchStart: Option[Date],
       @JsonDeserialize(contentAs = classOf[JLong])
       val duration: Option[Long],
       val executorId: String,
       val host: String,
       val status: String,
       val taskLocality: String,
       val speculative: Boolean,
       val accumulatorUpdates: Seq[AccumulableInfo],
       val errorMessage: Option[String] = None,
       val taskMetrics: Option[TaskMetrics] = None,
       val executorLogs: Map[String, String],
       val schedulerDelay: Long,
       val gettingResultTime: Long) 
   ```
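
   Below is a minimal, simplified sketch of the idea (not the actual Spark class; field names other than the ones shown in `TaskData` above, and the exact name and placement of the new field, are assumptions for illustration only). It shows how carrying a `partitionId` on task data would let metrics be grouped per partition even when a stage retry produces multiple task attempts for the same partition:

   ```scala
   import java.util.Date

   // Hypothetical, trimmed-down stand-in for TaskData, extended with a
   // partitionId field. Only a subset of the real fields is kept so the
   // example stays self-contained and runnable.
   case class TaskDataSketch(
       taskId: Long,
       index: Int,
       attempt: Int,
       partitionId: Int, // new: which partition this task computed
       launchTime: Date,
       executorId: String,
       host: String,
       status: String)

   // Usage: with partitionId present, metrics for a retried stage can be
   // grouped by partition rather than inferred from task index alone.
   object PartitionMetricsDemo extends App {
     val tasks = Seq(
       TaskDataSketch(0L, 0, 0, 0, new Date(), "exec-1", "host-a", "FAILED"),
       TaskDataSketch(1L, 0, 1, 0, new Date(), "exec-2", "host-b", "SUCCESS"))

     tasks.groupBy(_.partitionId).foreach { case (pid, ts) =>
       println(s"partition $pid: ${ts.map(_.status).mkString(", ")}")
     }
   }
   ```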
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   ### How was this patch tested?
   Added new tests.
   

