dongjoon-hyun commented on code in PR #41077:
URL: https://github.com/apache/spark/pull/41077#discussion_r1187681984
##########
core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala:
##########
@@ -1086,11 +1086,19 @@ private[spark] class TaskSchedulerImpl(
case ExecutorKilled =>
logInfo(s"Executor $executorId on $hostPort killed by driver.")
case _: ExecutorDecommission =>
- logInfo(s"Executor $executorId on $hostPort is decommissioned.")
+ logInfo(s"Executor $executorId on $hostPort is decommissioned after " +
+ s"${getDecommissionDuration(executorId)}.")
case _ =>
logError(s"Lost executor $executorId on $hostPort: $reason")
}
+ // return the decommission duration as a string, or "unknown time" if the decommission startTime does not exist
+ private def getDecommissionDuration(executorId: String): String = {
+ executorsPendingDecommission.get(executorId)
+ .map(s => Utils.msDurationToString(clock.getTimeMillis() - s.startTime))
+ .getOrElse("unknown time")
Review Comment:
So, the log message goes with `... is decommissioned after unknown time`?
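For reference, a minimal standalone sketch (not the PR's actual code) of the `Option`/`getOrElse` fallback in question; `DecommissionInfo`, `nowMillis`, and the millisecond formatting below are illustrative stand-ins for the `executorsPendingDecommission` entry, `clock.getTimeMillis()`, and `Utils.msDurationToString` referenced in the diff:

```scala
// Standalone sketch of the fallback path: when no decommission start time was
// recorded for an executor, getOrElse kicks in and the log would read
// "... is decommissioned after unknown time".
object DecommissionLogSketch {
  // Hypothetical stand-in for the scheduler's per-executor decommission state.
  final case class DecommissionInfo(startTime: Long)

  val executorsPendingDecommission: Map[String, DecommissionInfo] =
    Map("exec-1" -> DecommissionInfo(startTime = 1000L))

  // Fixed "clock" so the example is deterministic.
  def nowMillis: Long = 61000L

  // Mirrors the shape of the proposed getDecommissionDuration helper.
  def decommissionDuration(executorId: String): String =
    executorsPendingDecommission.get(executorId)
      .map(info => s"${nowMillis - info.startTime} ms") // simplistic formatting stand-in
      .getOrElse("unknown time")

  def main(args: Array[String]): Unit = {
    println(s"Executor exec-1 is decommissioned after ${decommissionDuration("exec-1")}.")
    // => Executor exec-1 is decommissioned after 60000 ms.
    println(s"Executor exec-2 is decommissioned after ${decommissionDuration("exec-2")}.")
    // => Executor exec-2 is decommissioned after unknown time.  <- the case the comment flags
  }
}
```

The second println shows the log shape the comment is asking about: any executor without a tracked start time falls through to the "unknown time" string.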
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]